[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 23826 1726867417.15482: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-Isn executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 23826 1726867417.16398: Added group all to inventory 23826 1726867417.16400: Added group ungrouped to inventory 23826 1726867417.16404: Group all now contains ungrouped 23826 1726867417.16407: Examining possible inventory source: /tmp/network-5rw/inventory.yml 23826 1726867417.50527: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 23826 1726867417.50810: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 23826 1726867417.50833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 23826 1726867417.50895: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 23826 1726867417.50965: Loaded config def from plugin (inventory/script) 23826 1726867417.50967: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 23826 1726867417.51211: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 23826 1726867417.51298: Loaded config def from plugin (inventory/yaml) 23826 1726867417.51300: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 23826 1726867417.51588: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 23826 1726867417.52418: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 23826 1726867417.52421: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 23826 1726867417.52424: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 23826 1726867417.52430: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 23826 1726867417.52434: Loading data from /tmp/network-5rw/inventory.yml 23826 1726867417.52501: /tmp/network-5rw/inventory.yml was not parsable by auto 23826 1726867417.52822: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 23826 1726867417.52863: Loading data from /tmp/network-5rw/inventory.yml 23826 1726867417.52949: group all already in inventory 23826 1726867417.52956: set inventory_file for managed_node1 23826 1726867417.52960: set inventory_dir for managed_node1 23826 1726867417.52961: Added host managed_node1 to inventory 23826 1726867417.52963: Added host managed_node1 to group all 23826 1726867417.52964: set ansible_host for managed_node1 23826 1726867417.52965: 
set ansible_ssh_extra_args for managed_node1 23826 1726867417.52968: set inventory_file for managed_node2 23826 1726867417.52970: set inventory_dir for managed_node2 23826 1726867417.52971: Added host managed_node2 to inventory 23826 1726867417.52972: Added host managed_node2 to group all 23826 1726867417.52973: set ansible_host for managed_node2 23826 1726867417.52974: set ansible_ssh_extra_args for managed_node2 23826 1726867417.52976: set inventory_file for managed_node3 23826 1726867417.53182: set inventory_dir for managed_node3 23826 1726867417.53184: Added host managed_node3 to inventory 23826 1726867417.53185: Added host managed_node3 to group all 23826 1726867417.53186: set ansible_host for managed_node3 23826 1726867417.53187: set ansible_ssh_extra_args for managed_node3 23826 1726867417.53190: Reconcile groups and hosts in inventory. 23826 1726867417.53194: Group ungrouped now contains managed_node1 23826 1726867417.53196: Group ungrouped now contains managed_node2 23826 1726867417.53197: Group ungrouped now contains managed_node3 23826 1726867417.53274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 23826 1726867417.53404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 23826 1726867417.53450: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 23826 1726867417.53679: Loaded config def from plugin (vars/host_group_vars) 23826 1726867417.53682: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 23826 1726867417.53689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 23826 1726867417.53696: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 23826 1726867417.53739: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 23826 1726867417.54249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867417.54470: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 23826 1726867417.54511: Loaded config def from plugin (connection/local) 23826 1726867417.54514: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 23826 1726867417.55752: Loaded config def from plugin (connection/paramiko_ssh) 23826 1726867417.55755: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 23826 1726867417.57627: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 23826 1726867417.57666: Loaded config def from plugin (connection/psrp) 23826 1726867417.57668: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 23826 1726867417.59417: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 23826 1726867417.59455: Loaded config def from plugin (connection/ssh) 23826 1726867417.59458: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 23826 1726867417.64721: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 23826 1726867417.64762: Loaded config def from plugin (connection/winrm) 23826 1726867417.64765: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 23826 1726867417.64799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 23826 1726867417.64862: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 23826 1726867417.65036: Loaded config def from plugin (shell/cmd) 23826 1726867417.65038: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 23826 1726867417.65064: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 23826 1726867417.65129: Loaded config def from plugin (shell/powershell) 23826 1726867417.65131: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 23826 1726867417.65390: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 23826 1726867417.65665: Loaded config def from plugin (shell/sh) 23826 1726867417.65668: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 23826 1726867417.65908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 23826 1726867417.66028: Loaded config def from plugin (become/runas) 23826 1726867417.66030: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 23826 1726867417.66589: Loaded config def from plugin (become/su) 23826 1726867417.66592: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 23826 1726867417.66947: Loaded config def from plugin (become/sudo) 23826 1726867417.66949: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 23826 1726867417.66988: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml 23826 1726867417.67613: in VariableManager get_vars() 23826 1726867417.67635: done with get_vars() 23826 1726867417.67766: trying /usr/local/lib/python3.12/site-packages/ansible/modules 23826 1726867417.73967: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 23826 1726867417.74288: in VariableManager 
get_vars() 23826 1726867417.74293: done with get_vars() 23826 1726867417.74296: variable 'playbook_dir' from source: magic vars 23826 1726867417.74297: variable 'ansible_playbook_python' from source: magic vars 23826 1726867417.74297: variable 'ansible_config_file' from source: magic vars 23826 1726867417.74298: variable 'groups' from source: magic vars 23826 1726867417.74299: variable 'omit' from source: magic vars 23826 1726867417.74300: variable 'ansible_version' from source: magic vars 23826 1726867417.74300: variable 'ansible_check_mode' from source: magic vars 23826 1726867417.74301: variable 'ansible_diff_mode' from source: magic vars 23826 1726867417.74302: variable 'ansible_forks' from source: magic vars 23826 1726867417.74303: variable 'ansible_inventory_sources' from source: magic vars 23826 1726867417.74303: variable 'ansible_skip_tags' from source: magic vars 23826 1726867417.74304: variable 'ansible_limit' from source: magic vars 23826 1726867417.74305: variable 'ansible_run_tags' from source: magic vars 23826 1726867417.74305: variable 'ansible_verbosity' from source: magic vars 23826 1726867417.74342: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml 23826 1726867417.75381: in VariableManager get_vars() 23826 1726867417.75398: done with get_vars() 23826 1726867417.75440: in VariableManager get_vars() 23826 1726867417.75462: done with get_vars() 23826 1726867417.75773: in VariableManager get_vars() 23826 1726867417.75792: done with get_vars() 23826 1726867417.75915: in VariableManager get_vars() 23826 1726867417.75929: done with get_vars() 23826 1726867417.75934: variable 'omit' from source: magic vars 23826 1726867417.75951: variable 'omit' from source: magic vars 23826 1726867417.76212: in VariableManager get_vars() 23826 1726867417.76223: done with get_vars() 23826 1726867417.76269: in VariableManager get_vars() 23826 1726867417.76283: done with get_vars() 23826 1726867417.76386: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 23826 1726867417.76735: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 23826 1726867417.77165: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 23826 1726867417.78685: in VariableManager get_vars() 23826 1726867417.78705: done with get_vars() 23826 1726867417.79683: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 23826 1726867417.80023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 23826 1726867417.82974: in VariableManager get_vars() 23826 1726867417.83081: done with get_vars() 23826 1726867417.83084: variable 'playbook_dir' from source: magic vars 23826 1726867417.83085: variable 'ansible_playbook_python' from source: magic vars 23826 1726867417.83086: variable 'ansible_config_file' from source: magic vars 23826 1726867417.83087: variable 'groups' from source: magic vars 23826 1726867417.83087: variable 'omit' from source: magic vars 23826 1726867417.83088: variable 'ansible_version' from source: magic vars 23826 1726867417.83089: variable 'ansible_check_mode' from source: magic vars 23826 1726867417.83089: variable 'ansible_diff_mode' from source: magic vars 23826 1726867417.83090: variable 'ansible_forks' from source: magic vars 23826 
1726867417.83090: variable 'ansible_inventory_sources' from source: magic vars 23826 1726867417.83091: variable 'ansible_skip_tags' from source: magic vars 23826 1726867417.83092: variable 'ansible_limit' from source: magic vars 23826 1726867417.83092: variable 'ansible_run_tags' from source: magic vars 23826 1726867417.83093: variable 'ansible_verbosity' from source: magic vars 23826 1726867417.83128: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 23826 1726867417.83348: in VariableManager get_vars() 23826 1726867417.83360: done with get_vars() 23826 1726867417.83402: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 23826 1726867417.83669: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 23826 1726867417.83747: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 23826 1726867417.84298: in VariableManager get_vars() 23826 1726867417.84332: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 23826 1726867417.85827: in VariableManager get_vars() 23826 1726867417.85831: done with get_vars() 23826 1726867417.85833: variable 'playbook_dir' from source: magic vars 23826 1726867417.85834: variable 'ansible_playbook_python' from source: magic vars 23826 1726867417.85834: variable 'ansible_config_file' from source: magic vars 23826 1726867417.85835: variable 'groups' from source: magic vars 23826 1726867417.85836: variable 'omit' from source: magic vars 23826 1726867417.85837: variable 'ansible_version' from source: magic vars 23826 1726867417.85837: variable 'ansible_check_mode' from source: magic vars 23826 1726867417.85838: variable 'ansible_diff_mode' from source: magic vars 23826 1726867417.85839: variable 'ansible_forks' from source: magic vars 23826 1726867417.85839: variable 'ansible_inventory_sources' from source: magic vars 23826 1726867417.85840: variable 'ansible_skip_tags' from source: magic vars 23826 1726867417.85841: variable 'ansible_limit' from source: magic vars 23826 1726867417.85842: variable 'ansible_run_tags' from source: magic vars 23826 1726867417.85842: variable 'ansible_verbosity' from source: magic vars 23826 1726867417.85876: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 23826 1726867417.85946: in VariableManager get_vars() 23826 1726867417.85958: done with get_vars() 23826 1726867417.85998: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 23826 1726867417.86106: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 23826 1726867417.89344: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 23826 1726867417.90158: in VariableManager get_vars() 23826 1726867417.90180: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 23826 1726867417.93281: in VariableManager get_vars() 23826 1726867417.93293: done with get_vars() 23826 1726867417.93327: in VariableManager get_vars() 23826 1726867417.93341: done with get_vars() 23826 1726867417.93376: in VariableManager get_vars() 23826 1726867417.93605: done with get_vars() 23826 1726867417.93641: in VariableManager 
get_vars() 23826 1726867417.93654: done with get_vars() 23826 1726867417.93719: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 23826 1726867417.93733: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 23826 1726867417.94167: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 23826 1726867417.94534: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 23826 1726867417.94537: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 23826 1726867417.94568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 23826 1726867417.94798: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 23826 1726867417.95189: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 23826 1726867417.95315: Loaded config def from plugin (callback/default) 23826 1726867417.95317: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 23826 1726867417.97866: Loaded config def from plugin (callback/junit) 23826 1726867417.97868: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 23826 1726867417.98259: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 23826 1726867417.98414: Loaded config def from plugin (callback/minimal) 23826 1726867417.98417: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 23826 1726867417.98456: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 23826 1726867417.98820: Loaded config def from plugin (callback/tree) 23826 1726867417.98822: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 23826 1726867417.98948: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 23826 1726867417.98950: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_ipv6_disabled_nm.yml ******************************************* 5 plays in /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml 23826 1726867417.99222: in VariableManager get_vars() 23826 1726867417.99236: done with get_vars() 23826 1726867417.99242: in VariableManager get_vars() 23826 1726867417.99250: done with get_vars() 23826 1726867417.99254: variable 'omit' from source: magic vars 23826 1726867417.99293: in VariableManager get_vars() 23826 1726867417.99307: done with get_vars() 23826 1726867417.99443: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_ipv6_disabled.yml' with nm as provider] **** 23826 1726867418.00583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 23826 1726867418.00770: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 23826 1726867418.00802: getting the remaining hosts for this loop 23826 1726867418.00804: done getting the remaining hosts for this loop 23826 1726867418.00806: getting the next task for host managed_node2 23826 1726867418.00810: done getting next task for host managed_node2 23826 1726867418.00811: ^ task is: TASK: Gathering Facts 23826 1726867418.00813: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867418.00815: getting variables 23826 1726867418.00816: in VariableManager get_vars() 23826 1726867418.00826: Calling all_inventory to load vars for managed_node2 23826 1726867418.00828: Calling groups_inventory to load vars for managed_node2 23826 1726867418.00830: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867418.00842: Calling all_plugins_play to load vars for managed_node2 23826 1726867418.00854: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867418.00857: Calling groups_plugins_play to load vars for managed_node2 23826 1726867418.01008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867418.01060: done with get_vars() 23826 1726867418.01067: done getting variables 23826 1726867418.01235: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:6 Friday 20 September 2024 17:23:38 -0400 (0:00:00.023) 0:00:00.023 ****** 23826 1726867418.01257: entering _queue_task() for managed_node2/gather_facts 23826 1726867418.01259: Creating lock for gather_facts 23826 1726867418.02225: worker is 1 (out of 1 available) 23826 1726867418.02233: exiting _queue_task() for managed_node2/gather_facts 23826 1726867418.02246: done queuing things up, now waiting for results queue to drain 23826 1726867418.02247: waiting for pending results... 
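The entries above show the play starting up: the 'linear' strategy selects the Gathering Facts task for managed_node2, and VariableManager merges that host's variables from the all_inventory, groups_inventory, all_plugins_* and groups_plugins_* sources before the task is queued. As an illustrative aside only (not part of this run's output), the short Python sketch below uses ansible-core's public Python API to load an inventory and print merged host variables in the same way; the file name "inventory.yml" and the variables printed are assumptions modelled on this run (ansible_host, ansible_ssh_extra_args), not taken from the actual test files.

# Minimal sketch, assuming an inventory.yml shaped like the one this run parsed.
from ansible.parsing.dataloader import DataLoader
from ansible.inventory.manager import InventoryManager
from ansible.vars.manager import VariableManager

loader = DataLoader()                                    # parses YAML/JSON inventory and vars files
inventory = InventoryManager(loader=loader, sources=["inventory.yml"])
variable_manager = VariableManager(loader=loader, inventory=inventory)

for host in inventory.get_hosts():                       # e.g. managed_node1..managed_node3 in this run
    merged = variable_manager.get_vars(host=host)        # inventory, group and play sources combined
    print(host.name, merged.get("ansible_host"), merged.get("ansible_ssh_extra_args"))

Run against the same inventory, a script like this would list the three managed nodes with the ansible_host and ansible_ssh_extra_args values that the log records being set during inventory parsing.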
23826 1726867418.02705: running TaskExecutor() for managed_node2/TASK: Gathering Facts 23826 1726867418.02711: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000000a3 23826 1726867418.02826: variable 'ansible_search_path' from source: unknown 23826 1726867418.02857: calling self._execute() 23826 1726867418.03002: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867418.03018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867418.03033: variable 'omit' from source: magic vars 23826 1726867418.03256: variable 'omit' from source: magic vars 23826 1726867418.03372: variable 'omit' from source: magic vars 23826 1726867418.03376: variable 'omit' from source: magic vars 23826 1726867418.03592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867418.03623: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867418.03645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867418.03686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867418.03808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867418.03826: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867418.03834: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867418.03841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867418.04132: Set connection var ansible_timeout to 10 23826 1726867418.04135: Set connection var ansible_shell_executable to /bin/sh 23826 1726867418.04138: Set connection var ansible_connection to ssh 23826 1726867418.04140: Set connection var ansible_pipelining to False 23826 1726867418.04142: Set connection var ansible_shell_type to sh 23826 1726867418.04146: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867418.04171: variable 'ansible_shell_executable' from source: unknown 23826 1726867418.04240: variable 'ansible_connection' from source: unknown 23826 1726867418.04243: variable 'ansible_module_compression' from source: unknown 23826 1726867418.04246: variable 'ansible_shell_type' from source: unknown 23826 1726867418.04248: variable 'ansible_shell_executable' from source: unknown 23826 1726867418.04250: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867418.04252: variable 'ansible_pipelining' from source: unknown 23826 1726867418.04254: variable 'ansible_timeout' from source: unknown 23826 1726867418.04256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867418.04785: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867418.04984: variable 'omit' from source: magic vars 23826 1726867418.04988: starting attempt loop 23826 1726867418.04990: running the handler 23826 1726867418.04992: variable 'ansible_facts' from source: unknown 23826 1726867418.04996: _low_level_execute_command(): starting 23826 1726867418.04998: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867418.06959: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867418.07106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867418.07243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867418.07250: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867418.07256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867418.07259: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867418.07352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867418.07492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867418.07809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867418.09489: stdout chunk (state=3): >>>/root <<< 23826 1726867418.09623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867418.09625: stdout chunk (state=3): >>><<< 23826 1726867418.09627: stderr chunk (state=3): >>><<< 23826 1726867418.09988: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867418.09991: _low_level_execute_command(): starting 23826 1726867418.09994: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867418.098863-23868-42426172992080 `" && echo 
ansible-tmp-1726867418.098863-23868-42426172992080="` echo /root/.ansible/tmp/ansible-tmp-1726867418.098863-23868-42426172992080 `" ) && sleep 0' 23826 1726867418.11497: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867418.11520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867418.11669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867418.11682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867418.11922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867418.11988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867418.14003: stdout chunk (state=3): >>>ansible-tmp-1726867418.098863-23868-42426172992080=/root/.ansible/tmp/ansible-tmp-1726867418.098863-23868-42426172992080 <<< 23826 1726867418.14115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867418.14147: stderr chunk (state=3): >>><<< 23826 1726867418.14162: stdout chunk (state=3): >>><<< 23826 1726867418.14245: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867418.098863-23868-42426172992080=/root/.ansible/tmp/ansible-tmp-1726867418.098863-23868-42426172992080 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867418.14473: variable 'ansible_module_compression' from source: unknown 23826 1726867418.14588: ANSIBALLZ: Using generic lock for ansible.legacy.setup 23826 1726867418.14685: 
ANSIBALLZ: Acquiring lock 23826 1726867418.14689: ANSIBALLZ: Lock acquired: 139851310993328 23826 1726867418.14691: ANSIBALLZ: Creating module 23826 1726867418.62031: ANSIBALLZ: Writing module into payload 23826 1726867418.62239: ANSIBALLZ: Writing module 23826 1726867418.62259: ANSIBALLZ: Renaming module 23826 1726867418.62265: ANSIBALLZ: Done creating module 23826 1726867418.62366: variable 'ansible_facts' from source: unknown 23826 1726867418.62373: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867418.62591: _low_level_execute_command(): starting 23826 1726867418.62666: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 23826 1726867418.63492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867418.63509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867418.63598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867418.63602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867418.63640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867418.63657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867418.63721: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867418.63771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867418.65483: stdout chunk (state=3): >>>PLATFORM <<< 23826 1726867418.65576: stdout chunk (state=3): >>>Linux <<< 23826 1726867418.65591: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 23826 1726867418.65760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867418.65764: stdout chunk (state=3): >>><<< 23826 1726867418.65767: stderr chunk (state=3): >>><<< 23826 1726867418.65989: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867418.65995 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 23826 1726867418.65999: _low_level_execute_command(): starting 23826 1726867418.66001: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 23826 1726867418.66173: Sending initial data 23826 1726867418.66186: Sent initial data (1181 bytes) 23826 1726867418.67495: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867418.67913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867418.67986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867418.71442: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 23826 1726867418.71866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867418.71870: stdout chunk (state=3): >>><<< 23826 1726867418.71876: stderr chunk (state=3): >>><<< 23826 1726867418.71894: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel 
fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867418.71987: variable 'ansible_facts' from source: unknown 23826 1726867418.71990: variable 'ansible_facts' from source: unknown 23826 1726867418.72000: variable 'ansible_module_compression' from source: unknown 23826 1726867418.72225: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 23826 1726867418.72253: variable 'ansible_facts' from source: unknown 23826 1726867418.72612: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867418.098863-23868-42426172992080/AnsiballZ_setup.py 23826 1726867418.72901: Sending initial data 23826 1726867418.72905: Sent initial data (152 bytes) 23826 1726867418.74365: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867418.74378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867418.74393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867418.74400: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 23826 1726867418.74504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867418.76135: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867418.76191: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 23826 1726867418.76235: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpgpf86pux /root/.ansible/tmp/ansible-tmp-1726867418.098863-23868-42426172992080/AnsiballZ_setup.py <<< 23826 1726867418.76244: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867418.098863-23868-42426172992080/AnsiballZ_setup.py" <<< 23826 1726867418.76396: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpgpf86pux" to remote "/root/.ansible/tmp/ansible-tmp-1726867418.098863-23868-42426172992080/AnsiballZ_setup.py" <<< 23826 1726867418.76399: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867418.098863-23868-42426172992080/AnsiballZ_setup.py" <<< 23826 1726867418.79340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867418.79549: stderr chunk (state=3): >>><<< 23826 1726867418.79552: stdout chunk (state=3): >>><<< 23826 1726867418.79554: done transferring module to remote 23826 1726867418.79556: _low_level_execute_command(): starting 23826 1726867418.79558: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867418.098863-23868-42426172992080/ /root/.ansible/tmp/ansible-tmp-1726867418.098863-23868-42426172992080/AnsiballZ_setup.py && sleep 0' 23826 1726867418.80829: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867418.80843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867418.80865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867418.80975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867418.81193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867418.81402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867418.81483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867418.83286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867418.83289: stdout chunk (state=3): >>><<< 23826 1726867418.83291: stderr chunk (state=3): >>><<< 23826 1726867418.83322: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867418.83416: _low_level_execute_command(): starting 23826 1726867418.83419: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867418.098863-23868-42426172992080/AnsiballZ_setup.py && sleep 0' 23826 1726867418.84573: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867418.84726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 23826 1726867418.84730: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 23826 1726867418.84732: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867418.84734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867418.84736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867418.84747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867418.84797: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867418.84948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867418.85035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867418.87205: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 23826 1726867418.87232: stdout chunk (state=3): >>>import _imp # builtin <<< 23826 1726867418.87270: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 23826 1726867418.87338: stdout chunk (state=3): >>>import '_io' # <<< 23826 1726867418.87358: stdout chunk (state=3): >>>import 'marshal' # <<< 23826 1726867418.87380: stdout chunk (state=3): >>>import 'posix' # <<< 23826 1726867418.87419: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 23826 1726867418.87444: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 23826 1726867418.87508: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 23826 1726867418.87535: stdout chunk (state=3): >>>import '_codecs' # <<< 23826 1726867418.87549: stdout chunk (state=3): >>>import 'codecs' # <<< 23826 1726867418.87568: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 23826 1726867418.87599: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7bb84d0><<< 23826 1726867418.87638: stdout chunk (state=3): >>> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7b87b30> <<< 23826 1726867418.87666: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7bbaa50> import '_signal' # <<< 23826 1726867418.87695: stdout chunk (state=3): >>>import '_abc' # <<< 23826 1726867418.87716: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 23826 1726867418.87757: stdout chunk (state=3): >>>import '_stat' # <<< 23826 1726867418.87773: stdout chunk (state=3): >>>import 'stat' # <<< 23826 1726867418.87826: stdout chunk (state=3): >>>import '_collections_abc' # <<< 23826 1726867418.87886: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 23826 1726867418.87895: stdout chunk (state=3): >>>import 'os' # <<< 23826 1726867418.87965: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 23826 1726867418.87976: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 23826 1726867418.87981: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 23826 1726867418.88001: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7969130> <<< 23826 1726867418.88065: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 23826 1726867418.88068: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 23826 1726867418.88105: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7969fa0> <<< 23826 1726867418.88111: stdout chunk (state=3): >>>import 'site' # <<< 23826 1726867418.88131: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 23826 1726867418.88497: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 23826 1726867418.88531: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 23826 1726867418.88557: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 23826 1726867418.88599: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 23826 1726867418.88638: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 23826 1726867418.88642: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 23826 1726867418.88672: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79a7dd0> <<< 23826 1726867418.88701: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 23826 1726867418.88725: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79a7fe0> <<< 23826 1726867418.88736: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 23826 1726867418.88765: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 23826 1726867418.88785: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 23826 1726867418.88832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 23826 1726867418.88855: stdout chunk (state=3): >>>import 'itertools' # <<< 23826 1726867418.88891: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches 
/usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79df800> <<< 23826 1726867418.88912: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79dfe90> <<< 23826 1726867418.88935: stdout chunk (state=3): >>>import '_collections' # <<< 23826 1726867418.88964: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79bfaa0> <<< 23826 1726867418.88973: stdout chunk (state=3): >>>import '_functools' # <<< 23826 1726867418.89026: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79bd1c0> <<< 23826 1726867418.89126: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79a4f80> <<< 23826 1726867418.89139: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 23826 1726867418.89172: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 23826 1726867418.89200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 23826 1726867418.89224: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 23826 1726867418.89247: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79ff6e0> <<< 23826 1726867418.89271: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79fe300> <<< 23826 1726867418.89293: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79be060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79a6e70> <<< 23826 1726867418.89341: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 23826 1726867418.89586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a347a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79a4200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 23826 1726867418.89627: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from 
'/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7a34c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a34b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7a34ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79a2d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a355b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a35280> import 'importlib.machinery' # <<< 23826 1726867418.89653: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a364b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 23826 1726867418.89673: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 23826 1726867418.89705: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 23826 1726867418.89741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a4c680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 23826 1726867418.89815: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7a4dd30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 23826 1726867418.89854: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a4ebd0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 23826 1726867418.89939: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f35d7a4f230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a4e120> <<< 23826 1726867418.89962: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7a4fcb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a4f3e0> <<< 23826 1726867418.89995: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a36450> <<< 23826 1726867418.90033: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 23826 1726867418.90109: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 23826 1726867418.90161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7743bc0> <<< 23826 1726867418.90196: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 23826 1726867418.90199: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d776c710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d776c470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d776c6b0> <<< 23826 1726867418.90348: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 23826 1726867418.90379: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 23826 1726867418.90425: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d776cfe0> <<< 23826 1726867418.90536: stdout chunk (state=3): >>># extension module '_blake2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d776d910> <<< 23826 1726867418.90619: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d776c890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7741d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 23826 1726867418.90708: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 23826 1726867418.90735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d776ecc0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d776d790> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a36ba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 23826 1726867418.90766: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 23826 1726867418.90794: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 23826 1726867418.90825: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 23826 1726867418.90856: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d779b020> <<< 23826 1726867418.90966: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 23826 1726867418.90995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 23826 1726867418.91053: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d77bb3e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 23826 1726867418.91080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 23826 1726867418.91115: stdout chunk (state=3): >>>import 'ntpath' # <<< 23826 1726867418.91153: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 23826 1726867418.91180: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d781c200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 23826 1726867418.91183: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 23826 1726867418.91206: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 23826 1726867418.91250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 23826 1726867418.91351: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d781e960> <<< 23826 1726867418.91492: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d781c320> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d77e91f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d71292e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d77ba1e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d776fbf0> <<< 23826 1726867418.91702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 23826 1726867418.91711: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f35d77ba300> <<< 23826 1726867418.92018: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_rskpu21a/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 23826 1726867418.92063: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867418.92114: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 23826 1726867418.92235: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 23826 1726867418.92268: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d718ef30> import '_typing' # <<< 23826 1726867418.92431: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d716de50> <<< 23826 1726867418.92450: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d716d010> # zipimport: zlib available <<< 23826 1726867418.92507: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867418.92533: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 23826 1726867418.92543: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867418.93924: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867418.95541: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code 
object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d718ce00> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d71c68d0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d71c6660> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d71c5f70> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d71c69c0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d718fbc0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d71c75f0> <<< 23826 1726867418.95545: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d71c7770> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 23826 1726867418.95548: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d71c7cb0> import 'pwd' # <<< 23826 1726867418.95550: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 23826 1726867418.95570: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 23826 1726867418.95655: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7029a90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 23826 1726867418.95705: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d702b6b0> # 
/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 23826 1726867418.95785: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d702bf80> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d702d220> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 23826 1726867418.95923: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 23826 1726867418.96086: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d702fd10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d779af90> <<< 23826 1726867418.96090: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d702dfd0> <<< 23826 1726867418.96173: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 23826 1726867418.96207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 23826 1726867418.96357: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7037cb0> import '_tokenize' # <<< 23826 1726867418.96440: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7036780> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d70364e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7036a50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d702e4e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d707bf20> <<< 23826 1726867418.96674: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d707c080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d707db50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d707d910> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 23826 1726867418.96767: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d707ffe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d707e1e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 23826 1726867418.96785: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 23826 1726867418.96798: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7083860> <<< 23826 1726867418.96933: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7080230> <<< 23826 1726867418.97282: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7084920> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7084740> # extension module 'systemd.id128' loaded 
from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7084a40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d707c290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6f10110> <<< 23826 1726867418.97391: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6f11280> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d70868d0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7087c50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d70864e0> <<< 23826 1726867418.97424: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 23826 1726867418.97514: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867418.97703: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867418.97725: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 23826 1726867418.97790: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867418.97932: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867418.98460: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867418.98993: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 23826 1726867418.99074: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 23826 1726867418.99194: stdout chunk (state=3): >>># extension module '_ctypes' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6f15580> <<< 23826 1726867418.99220: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6f16420> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6f114c0> <<< 23826 1726867418.99308: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867418.99311: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 23826 1726867418.99356: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867418.99473: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867418.99666: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 23826 1726867418.99671: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6f16510> # zipimport: zlib available <<< 23826 1726867419.00134: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.00487: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.00621: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.00718: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 23826 1726867419.00756: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 23826 1726867419.00784: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.00961: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 23826 1726867419.00964: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.00990: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.01003: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 23826 1726867419.01221: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.01609: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 23826 1726867419.01622: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6f175f0> # zipimport: zlib available <<< 23826 1726867419.01670: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.01742: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 23826 1726867419.01766: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 
23826 1726867419.01835: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.01864: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 23826 1726867419.01905: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.02002: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.02075: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 23826 1726867419.02111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 23826 1726867419.02198: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6f21ee0> <<< 23826 1726867419.02273: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6f1f290> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 23826 1726867419.02346: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.02399: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.02482: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.02513: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 23826 1726867419.02537: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 23826 1726867419.02612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 23826 1726867419.02631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 23826 1726867419.02717: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d700a9c0> <<< 23826 1726867419.02916: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d70fe690> <<< 23826 1726867419.02933: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6f220c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6f21ca0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # 
import 'ansible.module_utils.basic' # <<< 23826 1726867419.02972: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 23826 1726867419.03035: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.03093: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.03130: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.03174: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.03221: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.03254: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.03292: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 23826 1726867419.03305: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.03456: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.03485: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.03506: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 23826 1726867419.03687: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.03856: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.03897: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.03951: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 23826 1726867419.04038: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 23826 1726867419.04107: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6fb60c0> <<< 23826 1726867419.04182: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 23826 1726867419.04226: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6b800e0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 23826 1726867419.04250: stdout chunk (state=3): >>># extension module '_pickle' executed from 
'/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6b804d0> <<< 23826 1726867419.04274: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6f9c050> <<< 23826 1726867419.04301: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6fb6c30> <<< 23826 1726867419.04622: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6fb4830> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6fb43b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6b834d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6b82d80> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6b82f60> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6b821b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 23826 1726867419.04841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6b835f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6bce0f0> <<< 23826 1726867419.04939: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6bcc110> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6fb5850> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 23826 1726867419.05071: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 23826 1726867419.05183: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 23826 1726867419.05217: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.05285: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 23826 1726867419.05309: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.05628: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.05632: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.05741: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 23826 1726867419.06200: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.06739: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 23826 1726867419.06758: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.06864: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.06931: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.07125: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 23826 1726867419.07143: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 23826 1726867419.07195: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.07315: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 23826 1726867419.07447: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6bcf650> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 23826 1726867419.07522: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6bcecf0> import 'ansible.module_utils.facts.system.local' # <<< 23826 1726867419.07563: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.07636: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.07722: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 23826 1726867419.07737: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.07809: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 23826 1726867419.07958: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 23826 1726867419.08004: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.08071: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 23826 1726867419.08390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 23826 1726867419.08448: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6c0e240> <<< 23826 1726867419.08465: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6bfdfa0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 23826 1726867419.08483: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.08542: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 23826 1726867419.08700: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.08831: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.08983: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 23826 1726867419.09002: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.09046: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 23826 1726867419.09153: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 23826 1726867419.09197: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 23826 1726867419.09513: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6c21af0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6bff1d0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.09552: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.09632: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 23826 1726867419.09653: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 23826 1726867419.09788: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.09845: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.09882: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.09929: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 23826 1726867419.09968: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.09992: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.10120: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.10279: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 23826 1726867419.10398: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.10519: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 23826 1726867419.10532: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.10554: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.10602: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.11149: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.11700: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 23826 1726867419.11899: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 23826 1726867419.12026: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.12168: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 23826 1726867419.12249: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.12433: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 23826 1726867419.12460: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.12489: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.12532: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 23826 1726867419.12544: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.12900: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.12937: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.13128: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 23826 1726867419.13157: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 23826 1726867419.13179: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.13258: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 23826 1726867419.13272: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.13296: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 23826 
1726867419.13354: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.13532: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 23826 1726867419.13536: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.13598: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 23826 1726867419.13709: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.13730: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 23826 1726867419.13983: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.14255: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 23826 1726867419.14269: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.14341: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.14387: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 23826 1726867419.14399: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.14426: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.14474: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 23826 1726867419.14512: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.14537: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 23826 1726867419.14708: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.14716: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 23826 1726867419.14783: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.14839: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.14842: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 23826 1726867419.15026: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.15044: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.15273: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 23826 1726867419.15299: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 23826 1726867419.15320: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.15371: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 23826 1726867419.15710: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.15758: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 23826 1726867419.15771: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.15809: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 23826 1726867419.15859: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 23826 1726867419.15919: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.15976: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 23826 1726867419.16067: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.16129: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 23826 1726867419.16216: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.16246: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.16478: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 23826 1726867419.16506: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867419.17052: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 23826 1726867419.17089: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d69ba5d0> <<< 23826 1726867419.17158: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d69b8710> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d69b2180> <<< 23826 1726867419.32434: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6a012e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6a02090> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 23826 1726867419.32996: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f35d6c10680> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6c10170> <<< 23826 1726867419.33000: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 23826 1726867419.53134: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "23", "second": "39", "epoch": "1726867419", "epoch_int": "1726867419", "date": "2024-09-20", "time": "17:23:39", "iso8601_micro": "2024-09-20T21:23:39.177186Z", "iso8601": "2024-09-20T21:23:39Z", "iso8601_basic": "20240920T172339177186", "iso8601_basic_short": "20240920T172339", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.2998046875, "5m": 0.34765625, "15m": 0.20361328125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": 
"off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", 
"esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2949, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 582, "free": 2949}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 657, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 
0, "size_total": 268366229504, "size_available": 261794508800, "block_size": 4096, "block_total": 65519099, "block_available": 63914675, "block_used": 1604424, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 23826 1726867419.53589: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 23826 1726867419.53643: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct <<< 23826 1726867419.53684: stdout chunk (state=3): >>># cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect 
# cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors <<< 23826 1726867419.53735: stdout chunk (state=3): >>># cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy 
ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file <<< 23826 1726867419.53767: stdout chunk (state=3): >>># destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline <<< 23826 1726867419.53802: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # 
cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly <<< 23826 1726867419.54079: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing 
ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # 
destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 23826 1726867419.54313: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath <<< 23826 1726867419.54585: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 23826 1726867419.54621: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux <<< 23826 1726867419.54765: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 23826 1726867419.54775: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 23826 1726867419.54802: stdout 
chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc<<< 23826 1726867419.54806: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 23826 1726867419.54824: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants <<< 23826 1726867419.54828: stdout chunk (state=3): >>># destroy re._casefix # destroy re._compiler # destroy enum <<< 23826 1726867419.54845: stdout chunk (state=3): >>># cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 23826 1726867419.54991: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io <<< 23826 1726867419.55009: stdout chunk (state=3): >>># cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 23826 1726867419.55048: stdout chunk (state=3): >>># destroy sys.monitoring <<< 23826 1726867419.55060: stdout chunk (state=3): >>># destroy _socket <<< 23826 1726867419.55065: stdout chunk (state=3): >>># destroy _collections <<< 23826 1726867419.55121: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 23826 1726867419.55129: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 23826 1726867419.55162: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 23826 1726867419.55608: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy 
_codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 23826 1726867419.55792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 23826 1726867419.55840: stderr chunk (state=3): >>><<< 23826 1726867419.55850: stdout chunk (state=3): >>><<< 23826 1726867419.56274: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7bb84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7b87b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7bbaa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7969130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7969fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79a7dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79a7fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79df800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79dfe90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79bfaa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79bd1c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79a4f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79ff6e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79fe300> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79be060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79a6e70> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a347a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79a4200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7a34c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a34b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7a34ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d79a2d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a355b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a35280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a364b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a4c680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7a4dd30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f35d7a4ebd0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7a4f230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a4e120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7a4fcb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a4f3e0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a36450> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7743bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d776c710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d776c470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d776c6b0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d776cfe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d776d910> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f35d776c890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7741d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d776ecc0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d776d790> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7a36ba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d779b020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d77bb3e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d781c200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d781e960> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d781c320> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d77e91f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d71292e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d77ba1e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d776fbf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f35d77ba300> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_rskpu21a/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d718ef30> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d716de50> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d716d010> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d718ce00> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d71c68d0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d71c6660> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d71c5f70> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d71c69c0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d718fbc0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d71c75f0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d71c7770> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d71c7cb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7029a90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d702b6b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d702bf80> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d702d220> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d702fd10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d779af90> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d702dfd0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7037cb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7036780> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f35d70364e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7036a50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d702e4e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d707bf20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d707c080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d707db50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d707d910> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d707ffe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d707e1e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7083860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d7080230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7084920> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7084740> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7084a40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d707c290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6f10110> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6f11280> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d70868d0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d7087c50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d70864e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6f15580> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6f16420> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6f114c0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6f16510> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6f175f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6f21ee0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6f1f290> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d700a9c0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d70fe690> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6f220c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6f21ca0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6fb60c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6b800e0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6b804d0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6f9c050> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6fb6c30> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6fb4830> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6fb43b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6b834d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6b82d80> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6b82f60> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6b821b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6b835f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6bce0f0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6bcc110> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6fb5850> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6bcf650> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6bcecf0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6c0e240> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6bfdfa0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d6c21af0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6bff1d0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35d69ba5d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d69b8710> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d69b2180> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6a012e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6a02090> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6c10680> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35d6c10170> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "23", "second": "39", "epoch": "1726867419", "epoch_int": "1726867419", "date": "2024-09-20", "time": "17:23:39", "iso8601_micro": "2024-09-20T21:23:39.177186Z", "iso8601": "2024-09-20T21:23:39Z", "iso8601_basic": "20240920T172339177186", "iso8601_basic_short": "20240920T172339", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.2998046875, "5m": 0.34765625, "15m": 0.20361328125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", 
"highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", 
"hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2949, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 582, "free": 2949}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 657, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": 
"rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794508800, "block_size": 4096, "block_total": 65519099, "block_available": 63914675, "block_used": 1604424, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] 
removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing 
ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing 
ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing 
multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping 
re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed.
[WARNING]: Module invocation had junk after the JSON data: (the interpreter shutdown trace, identical to the "# clear ... # cleanup ... # destroy ..." output reproduced above)
[WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
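Note on the "junk after the JSON data" warning above: the managed node's Python printed its verbose interpreter-shutdown trace after the module's JSON result, and ansible-core simply extracts the leading JSON object and warns about the remainder. The sketch below is only an illustration of that kind of extraction on stdout shaped like this capture; it is not ansible-core's actual parsing code, and split_module_output and sample are names introduced here for the example.

    import json

    def split_module_output(stdout: str):
        # Decode one complete JSON object starting at the first '{';
        # whatever follows is the "junk" the warning above refers to.
        decoder = json.JSONDecoder()
        start = stdout.index("{")
        result, end = decoder.raw_decode(stdout[start:])
        junk = stdout[start + end:].strip()
        return result, junk

    # Shortened, hypothetical sample shaped like the capture above:
    sample = '{"ansible_facts": {"ansible_pkg_mgr": "dnf"}, "invocation": {}} # clear sys.path_importer_cache # clear sys.path_hooks'
    result, junk = split_module_output(sample)
    print(result["ansible_facts"]["ansible_pkg_mgr"])  # dnf
    print(bool(junk))  # True -> this is the condition that produces the warning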
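The "debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8'" lines show the ssh connection plugin reusing an existing OpenSSH ControlMaster socket instead of opening a fresh session for each command. As a rough, hypothetical illustration of that reuse (not code from Ansible itself), the helper below probes such a socket with `ssh -O check`:

    import subprocess

    def control_master_alive(control_path: str, host: str) -> bool:
        # `ssh -O check` exits 0 when a master process is listening on the socket.
        proc = subprocess.run(
            ["ssh", "-o", f"ControlPath={control_path}", "-O", "check", host],
            capture_output=True,
            text=True,
        )
        return proc.returncode == 0

    # Example call using the socket path and address seen in this run
    # (whether the socket is still present depends on the controller's state):
    # control_master_alive("/root/.ansible/cp/1ce91f36e8", "10.31.12.116")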
23826 1726867419.59725: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867418.098863-23868-42426172992080/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867419.59743: _low_level_execute_command(): starting 23826 1726867419.59746: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867418.098863-23868-42426172992080/ > /dev/null 2>&1 && sleep 0' 23826 1726867419.61325: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867419.61486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867419.61492: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867419.61551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867419.61557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867419.61623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867419.61640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867419.63807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867419.63817: stdout chunk (state=3): >>><<< 23826 1726867419.63826: stderr chunk (state=3): >>><<< 23826 1726867419.63849: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 23826 1726867419.63911: handler run complete 23826 1726867419.64119: variable 'ansible_facts' from source: unknown 23826 1726867419.64233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867419.64599: variable 'ansible_facts' from source: unknown 23826 1726867419.64707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867419.64865: attempt loop complete, returning result 23826 1726867419.64874: _execute() done 23826 1726867419.64882: dumping result to json 23826 1726867419.64915: done dumping result, returning 23826 1726867419.64928: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-a92d-a3ea-0000000000a3] 23826 1726867419.64935: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000a3 23826 1726867419.65673: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000a3 23826 1726867419.65676: WORKER PROCESS EXITING ok: [managed_node2] 23826 1726867419.66147: no more pending results, returning what we have 23826 1726867419.66153: results queue empty 23826 1726867419.66154: checking for any_errors_fatal 23826 1726867419.66155: done checking for any_errors_fatal 23826 1726867419.66156: checking for max_fail_percentage 23826 1726867419.66157: done checking for max_fail_percentage 23826 1726867419.66158: checking to see if all hosts have failed and the running result is not ok 23826 1726867419.66159: done checking to see if all hosts have failed 23826 1726867419.66160: getting the remaining hosts for this loop 23826 1726867419.66161: done getting the remaining hosts for this loop 23826 1726867419.66165: getting the next task for host managed_node2 23826 1726867419.66170: done getting next task for host managed_node2 23826 1726867419.66172: ^ task is: TASK: meta (flush_handlers) 23826 1726867419.66174: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867419.66180: getting variables 23826 1726867419.66182: in VariableManager get_vars() 23826 1726867419.66202: Calling all_inventory to load vars for managed_node2 23826 1726867419.66212: Calling groups_inventory to load vars for managed_node2 23826 1726867419.66216: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867419.66225: Calling all_plugins_play to load vars for managed_node2 23826 1726867419.66228: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867419.66231: Calling groups_plugins_play to load vars for managed_node2 23826 1726867419.66436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867419.66660: done with get_vars() 23826 1726867419.66674: done getting variables 23826 1726867419.66889: in VariableManager get_vars() 23826 1726867419.66898: Calling all_inventory to load vars for managed_node2 23826 1726867419.66899: Calling groups_inventory to load vars for managed_node2 23826 1726867419.66901: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867419.66905: Calling all_plugins_play to load vars for managed_node2 23826 1726867419.66907: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867419.66909: Calling groups_plugins_play to load vars for managed_node2 23826 1726867419.67089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867419.67273: done with get_vars() 23826 1726867419.67287: done queuing things up, now waiting for results queue to drain 23826 1726867419.67289: results queue empty 23826 1726867419.67290: checking for any_errors_fatal 23826 1726867419.67292: done checking for any_errors_fatal 23826 1726867419.67293: checking for max_fail_percentage 23826 1726867419.67294: done checking for max_fail_percentage 23826 1726867419.67298: checking to see if all hosts have failed and the running result is not ok 23826 1726867419.67299: done checking to see if all hosts have failed 23826 1726867419.67300: getting the remaining hosts for this loop 23826 1726867419.67300: done getting the remaining hosts for this loop 23826 1726867419.67303: getting the next task for host managed_node2 23826 1726867419.67309: done getting next task for host managed_node2 23826 1726867419.67312: ^ task is: TASK: Include the task 'el_repo_setup.yml' 23826 1726867419.67313: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867419.67315: getting variables 23826 1726867419.67316: in VariableManager get_vars() 23826 1726867419.67324: Calling all_inventory to load vars for managed_node2 23826 1726867419.67326: Calling groups_inventory to load vars for managed_node2 23826 1726867419.67328: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867419.67333: Calling all_plugins_play to load vars for managed_node2 23826 1726867419.67335: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867419.67337: Calling groups_plugins_play to load vars for managed_node2 23826 1726867419.67491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867419.67671: done with get_vars() 23826 1726867419.67680: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:11 Friday 20 September 2024 17:23:39 -0400 (0:00:01.664) 0:00:01.688 ****** 23826 1726867419.67757: entering _queue_task() for managed_node2/include_tasks 23826 1726867419.67761: Creating lock for include_tasks 23826 1726867419.68027: worker is 1 (out of 1 available) 23826 1726867419.68036: exiting _queue_task() for managed_node2/include_tasks 23826 1726867419.68046: done queuing things up, now waiting for results queue to drain 23826 1726867419.68047: waiting for pending results... 23826 1726867419.68536: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 23826 1726867419.68541: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000006 23826 1726867419.68544: variable 'ansible_search_path' from source: unknown 23826 1726867419.68668: calling self._execute() 23826 1726867419.68804: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867419.68817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867419.68840: variable 'omit' from source: magic vars 23826 1726867419.69189: _execute() done 23826 1726867419.69192: dumping result to json 23826 1726867419.69195: done dumping result, returning 23826 1726867419.69197: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [0affcac9-a3a5-a92d-a3ea-000000000006] 23826 1726867419.69199: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000006 23826 1726867419.69311: no more pending results, returning what we have 23826 1726867419.69316: in VariableManager get_vars() 23826 1726867419.69348: Calling all_inventory to load vars for managed_node2 23826 1726867419.69351: Calling groups_inventory to load vars for managed_node2 23826 1726867419.69354: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867419.69367: Calling all_plugins_play to load vars for managed_node2 23826 1726867419.69369: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867419.69372: Calling groups_plugins_play to load vars for managed_node2 23826 1726867419.69909: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000006 23826 1726867419.69913: WORKER PROCESS EXITING 23826 1726867419.69935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867419.70348: done with get_vars() 23826 1726867419.70355: variable 'ansible_search_path' from source: unknown 23826 1726867419.70368: we have included files to process 23826 1726867419.70369: 
generating all_blocks data 23826 1726867419.70371: done generating all_blocks data 23826 1726867419.70371: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 23826 1726867419.70373: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 23826 1726867419.70376: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 23826 1726867419.71068: in VariableManager get_vars() 23826 1726867419.71089: done with get_vars() 23826 1726867419.71101: done processing included file 23826 1726867419.71103: iterating over new_blocks loaded from include file 23826 1726867419.71105: in VariableManager get_vars() 23826 1726867419.71117: done with get_vars() 23826 1726867419.71119: filtering new block on tags 23826 1726867419.71133: done filtering new block on tags 23826 1726867419.71136: in VariableManager get_vars() 23826 1726867419.71145: done with get_vars() 23826 1726867419.71146: filtering new block on tags 23826 1726867419.71161: done filtering new block on tags 23826 1726867419.71164: in VariableManager get_vars() 23826 1726867419.71174: done with get_vars() 23826 1726867419.71175: filtering new block on tags 23826 1726867419.71194: done filtering new block on tags 23826 1726867419.71197: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 23826 1726867419.71202: extending task lists for all hosts with included blocks 23826 1726867419.71253: done extending task lists 23826 1726867419.71255: done processing included files 23826 1726867419.71255: results queue empty 23826 1726867419.71256: checking for any_errors_fatal 23826 1726867419.71257: done checking for any_errors_fatal 23826 1726867419.71258: checking for max_fail_percentage 23826 1726867419.71259: done checking for max_fail_percentage 23826 1726867419.71259: checking to see if all hosts have failed and the running result is not ok 23826 1726867419.71260: done checking to see if all hosts have failed 23826 1726867419.71261: getting the remaining hosts for this loop 23826 1726867419.71262: done getting the remaining hosts for this loop 23826 1726867419.71264: getting the next task for host managed_node2 23826 1726867419.71268: done getting next task for host managed_node2 23826 1726867419.71270: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 23826 1726867419.71272: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867419.71274: getting variables 23826 1726867419.71275: in VariableManager get_vars() 23826 1726867419.71285: Calling all_inventory to load vars for managed_node2 23826 1726867419.71287: Calling groups_inventory to load vars for managed_node2 23826 1726867419.71289: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867419.71293: Calling all_plugins_play to load vars for managed_node2 23826 1726867419.71298: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867419.71301: Calling groups_plugins_play to load vars for managed_node2 23826 1726867419.71457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867419.71642: done with get_vars() 23826 1726867419.71651: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 17:23:39 -0400 (0:00:00.039) 0:00:01.728 ****** 23826 1726867419.71725: entering _queue_task() for managed_node2/setup 23826 1726867419.72091: worker is 1 (out of 1 available) 23826 1726867419.72103: exiting _queue_task() for managed_node2/setup 23826 1726867419.72117: done queuing things up, now waiting for results queue to drain 23826 1726867419.72119: waiting for pending results... 23826 1726867419.72282: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 23826 1726867419.72395: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000000b4 23826 1726867419.72419: variable 'ansible_search_path' from source: unknown 23826 1726867419.72427: variable 'ansible_search_path' from source: unknown 23826 1726867419.72473: calling self._execute() 23826 1726867419.72542: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867419.72552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867419.72570: variable 'omit' from source: magic vars 23826 1726867419.73075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867419.75230: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867419.75384: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867419.75388: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867419.75415: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867419.75449: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867419.75538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867419.75572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867419.75604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 23826 1726867419.75656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867419.75678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867419.76010: variable 'ansible_facts' from source: unknown 23826 1726867419.76121: variable 'network_test_required_facts' from source: task vars 23826 1726867419.76162: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 23826 1726867419.76184: variable 'omit' from source: magic vars 23826 1726867419.76225: variable 'omit' from source: magic vars 23826 1726867419.76261: variable 'omit' from source: magic vars 23826 1726867419.76296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867419.76332: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867419.76353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867419.76375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867419.76401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867419.76440: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867419.76443: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867419.76446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867419.76553: Set connection var ansible_timeout to 10 23826 1726867419.76568: Set connection var ansible_shell_executable to /bin/sh 23826 1726867419.76575: Set connection var ansible_connection to ssh 23826 1726867419.76587: Set connection var ansible_pipelining to False 23826 1726867419.76594: Set connection var ansible_shell_type to sh 23826 1726867419.76603: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867419.76633: variable 'ansible_shell_executable' from source: unknown 23826 1726867419.76652: variable 'ansible_connection' from source: unknown 23826 1726867419.76655: variable 'ansible_module_compression' from source: unknown 23826 1726867419.76658: variable 'ansible_shell_type' from source: unknown 23826 1726867419.76660: variable 'ansible_shell_executable' from source: unknown 23826 1726867419.76722: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867419.76726: variable 'ansible_pipelining' from source: unknown 23826 1726867419.76728: variable 'ansible_timeout' from source: unknown 23826 1726867419.76731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867419.76849: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 23826 1726867419.76871: variable 'omit' from source: magic vars 23826 1726867419.76884: starting attempt loop 23826 
1726867419.76891: running the handler 23826 1726867419.76910: _low_level_execute_command(): starting 23826 1726867419.76921: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867419.77884: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867419.77969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867419.78004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867419.80162: stdout chunk (state=3): >>>/root <<< 23826 1726867419.80503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867419.80509: stdout chunk (state=3): >>><<< 23826 1726867419.80512: stderr chunk (state=3): >>><<< 23826 1726867419.80515: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867419.80524: _low_level_execute_command(): starting 23826 1726867419.80527: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867419.8042023-23936-171661330374778 `" && echo ansible-tmp-1726867419.8042023-23936-171661330374778="` echo /root/.ansible/tmp/ansible-tmp-1726867419.8042023-23936-171661330374778 `" ) && sleep 0' 23826 1726867419.81069: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
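The 'echo ~' probe just completed above and the 'umask 77 && mkdir ...' one-liner just dispatched are the controller's standard pre-module handshake: the first discovers the remote user's home directory (here '/root', rc=0), the second creates a private, uniquely named directory under ~/.ansible/tmp and echoes "<name>=<path>" back so the action plugin can parse the remote temp path out of stdout. A minimal local sketch of that handshake follows, using plain subprocess calls instead of Ansible's connection plugin; the run_sh helper, the local /bin/sh execution, and the numeric pid/random suffixes in the temp-dir name are illustrative assumptions, not Ansible code or the values from this log.

    import subprocess
    import time

    def run_sh(command: str) -> subprocess.CompletedProcess:
        # Every _low_level_execute_command() in this log wraps its payload in
        # /bin/sh -c '<command> && sleep 0'; mirror that locally for illustration.
        return subprocess.run(["/bin/sh", "-c", command + " && sleep 0"],
                              capture_output=True, text=True)

    # Step 1: the 'echo ~' probe -- discover the user's home directory
    # (in the log this comes back as '/root' with rc=0).
    home = run_sh("echo ~").stdout.strip()

    # Step 2: create a private, uniquely named temp directory under ~/.ansible/tmp
    # and echo "<name>=<path>" so the caller can parse the path from stdout.
    # The pid/random suffixes below are placeholders, not the log's values.
    stamp = f"ansible-tmp-{time.time()}-12345-678901234567890"
    mkdir_cmd = (
        f'( umask 77 && mkdir -p "` echo {home}/.ansible/tmp `" '
        f'&& mkdir "` echo {home}/.ansible/tmp/{stamp} `" '
        f'&& echo {stamp}="` echo {home}/.ansible/tmp/{stamp} `" )'
    )
    result = run_sh(mkdir_cmd)
    remote_tmp = result.stdout.strip().split("=", 1)[1]
    # In the log's run this resolves to /root/.ansible/tmp/ansible-tmp-1726867419.80...
    print(result.returncode, remote_tmp)

Note the design visible in the surrounding stderr: both probes ride the existing ControlMaster socket ('/root/.ansible/cp/1ce91f36e8'), so each one costs only a mux session ("mux_client_request_session: master session id: 2") rather than a fresh SSH handshake, and the directory parsed here is where AnsiballZ_setup.py is uploaded in the next step.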
23826 1726867419.81085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867419.81101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867419.81133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867419.81149: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867419.81238: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867419.81270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867419.81300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867419.81370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867419.84173: stdout chunk (state=3): >>>ansible-tmp-1726867419.8042023-23936-171661330374778=/root/.ansible/tmp/ansible-tmp-1726867419.8042023-23936-171661330374778 <<< 23826 1726867419.84364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867419.84374: stdout chunk (state=3): >>><<< 23826 1726867419.84397: stderr chunk (state=3): >>><<< 23826 1726867419.84583: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867419.8042023-23936-171661330374778=/root/.ansible/tmp/ansible-tmp-1726867419.8042023-23936-171661330374778 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867419.84586: variable 'ansible_module_compression' from source: unknown 23826 1726867419.84589: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 23826 1726867419.84606: 
variable 'ansible_facts' from source: unknown 23826 1726867419.84837: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867419.8042023-23936-171661330374778/AnsiballZ_setup.py 23826 1726867419.84997: Sending initial data 23826 1726867419.85006: Sent initial data (154 bytes) 23826 1726867419.85692: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867419.85727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867419.85744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867419.85759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867419.85836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867419.88165: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867419.88219: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867419.88261: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpkmzkay2b /root/.ansible/tmp/ansible-tmp-1726867419.8042023-23936-171661330374778/AnsiballZ_setup.py <<< 23826 1726867419.88281: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867419.8042023-23936-171661330374778/AnsiballZ_setup.py" <<< 23826 1726867419.88356: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpkmzkay2b" to remote "/root/.ansible/tmp/ansible-tmp-1726867419.8042023-23936-171661330374778/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867419.8042023-23936-171661330374778/AnsiballZ_setup.py" <<< 23826 1726867419.91947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867419.91963: stderr chunk (state=3): >>><<< 23826 1726867419.92001: stdout chunk (state=3): >>><<< 23826 1726867419.92186: done transferring module to remote 23826 1726867419.92189: _low_level_execute_command(): starting 23826 1726867419.92192: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867419.8042023-23936-171661330374778/ /root/.ansible/tmp/ansible-tmp-1726867419.8042023-23936-171661330374778/AnsiballZ_setup.py && sleep 0' 23826 1726867419.93185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867419.93483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867419.93503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867419.93524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867419.93551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867419.93665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867419.96347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867419.96351: stdout chunk (state=3): >>><<< 23826 1726867419.96353: stderr chunk (state=3): >>><<< 23826 1726867419.96369: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867419.96376: _low_level_execute_command(): starting 23826 1726867419.96390: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867419.8042023-23936-171661330374778/AnsiballZ_setup.py && sleep 0' 23826 1726867419.97596: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867419.97746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867419.97790: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867419.97812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867419.97964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867420.01525: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # <<< 23826 1726867420.01547: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 23826 1726867420.01700: stdout chunk (state=3): >>># installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 23826 1726867420.01731: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py<<< 23826 1726867420.01734: stdout chunk (state=3): >>> <<< 23826 1726867420.01833: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 23826 1726867420.01944: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code 
object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c45104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c44dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 23826 1726867420.02010: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4512a50> import '_signal' # import '_abc' # import 'abc' # <<< 23826 1726867420.02157: stdout chunk (state=3): >>>import 'io' # import '_stat' # import 'stat' # <<< 23826 1726867420.02215: stdout chunk (state=3): >>>import '_collections_abc' # <<< 23826 1726867420.02221: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 23826 1726867420.02287: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 23826 1726867420.02361: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c42c1130> <<< 23826 1726867420.02480: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 23826 1726867420.02484: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c42c1fa0> <<< 23826 1726867420.02497: stdout chunk (state=3): >>>import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 23826 1726867420.02842: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 23826 1726867420.02858: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 23826 1726867420.02874: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 23826 1726867420.03108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c42ffe90> <<< 23826 1726867420.03113: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c42fff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 23826 1726867420.03118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 23826 1726867420.03138: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 23826 1726867420.03190: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 23826 1726867420.03203: stdout chunk (state=3): >>>import 'itertools' # <<< 23826 1726867420.03229: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 23826 1726867420.03295: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4337890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4337f20> import '_collections' # <<< 23826 1726867420.03332: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4317b60> <<< 23826 1726867420.03346: stdout chunk (state=3): >>>import '_functools' # <<< 23826 1726867420.03363: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4315280> <<< 23826 1726867420.03497: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c42fd040> <<< 23826 1726867420.03560: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 23826 1726867420.03579: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 23826 1726867420.03594: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 23826 1726867420.03636: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4357800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4356420> <<< 23826 1726867420.03685: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4316150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4354c80> <<< 23826 1726867420.03733: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c438c890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c42fc2c0> <<< 23826 1726867420.04034: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c438cd40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c438cbf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c438cfe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c42fade0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c438d6d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c438d3a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 23826 1726867420.04037: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c438e5d0> 
import 'importlib.util' # import 'runpy' # <<< 23826 1726867420.04118: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 23826 1726867420.04135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c43a47a0> import 'errno' # <<< 23826 1726867420.04316: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c43a5eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 23826 1726867420.04502: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c43a6d50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c43a7380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c43a62a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c43a7e00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c43a7530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c438e570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 23826 1726867420.04563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c40afce0> <<< 23826 1726867420.04658: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c40d8740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c40d84a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c40d8770> <<< 23826 1726867420.04697: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 23826 1726867420.04804: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 23826 1726867420.04955: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c40d90a0> <<< 23826 1726867420.05145: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c40d9a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c40d8950> <<< 23826 1726867420.05201: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c40ade80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 23826 1726867420.05244: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 23826 1726867420.05283: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c40dae10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c40d98e0> <<< 23826 1726867420.05329: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c438ecc0> <<< 23826 1726867420.05335: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 23826 1726867420.05428: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 23826 1726867420.05473: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 23826 1726867420.05515: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4103170> <<< 23826 
1726867420.05581: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 23826 1726867420.05598: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 23826 1726867420.05640: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 23826 1726867420.05781: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c41274d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 23826 1726867420.05857: stdout chunk (state=3): >>>import 'ntpath' # <<< 23826 1726867420.05895: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c41882f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 23826 1726867420.05968: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 23826 1726867420.06014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 23826 1726867420.06144: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c418aa20> <<< 23826 1726867420.06296: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c41883e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c414d2e0> <<< 23826 1726867420.06330: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3f953d0> <<< 23826 1726867420.06359: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4126300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c40dbd40> <<< 23826 1726867420.06664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f55c4126660> <<< 23826 1726867420.07028: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_til5fzcn/ansible_setup_payload.zip' # zipimport: zlib available <<< 23826 1726867420.07238: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.07274: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 23826 1726867420.07520: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3fff0e0> import '_typing' # <<< 23826 1726867420.07567: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3fddfd0> <<< 23826 1726867420.07573: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3fdd160> # zipimport: zlib available <<< 23826 1726867420.07617: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867420.07664: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 23826 1726867420.09268: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.11323: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3ffd760> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 23826 1726867420.11472: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c402eab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c402e840> <<< 23826 1726867420.11547: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c402e150> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c402e5a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c45129c0> import 'atexit' # <<< 23826 1726867420.11587: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c402f7a0> # extension module 'fcntl' 
loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c402f9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 23826 1726867420.11695: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c402ff20> import 'pwd' # <<< 23826 1726867420.11721: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 23826 1726867420.11803: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c392dc10> <<< 23826 1726867420.11831: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c392f830> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 23826 1726867420.11854: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 23826 1726867420.11994: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3930200> <<< 23826 1726867420.12031: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3931370> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 23826 1726867420.12135: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3933e90> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 23826 1726867420.12231: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c41030e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3932150> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 23826 1726867420.12261: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 23826 1726867420.12505: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c393bdd0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c393a8a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c393a600> <<< 23826 1726867420.12540: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 23826 1726867420.12553: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 23826 1726867420.12793: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c393ab70> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3932660> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c393ab10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c39800e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 23826 1726867420.12797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 23826 1726867420.12799: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 23826 1726867420.12933: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3981ca0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3981a60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 23826 1726867420.12945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3984260> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f55c3982390> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 23826 1726867420.12971: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 23826 1726867420.13065: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3987a40> <<< 23826 1726867420.13198: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3984410> <<< 23826 1726867420.13255: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 23826 1726867420.13401: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3988830> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 23826 1726867420.13445: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3988aa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c39883e0> <<< 23826 1726867420.13531: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c39803b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3814320> <<< 23826 1726867420.13637: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 23826 1726867420.13640: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3815820> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c398aab0> <<< 23826 1726867420.13884: stdout 
chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c398be60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c398a6f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867420.13923: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 23826 1726867420.13947: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.13981: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 23826 1726867420.14222: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 23826 1726867420.14756: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.15299: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 23826 1726867420.15414: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 23826 1726867420.15417: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 23826 1726867420.15435: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3819a30> <<< 23826 1726867420.15485: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 23826 1726867420.15547: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c381a840> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3815c70> <<< 23826 1726867420.15621: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # <<< 23826 1726867420.15653: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.15783: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.15971: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 23826 1726867420.16055: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c381a5a0> # zipimport: zlib available <<< 23826 1726867420.16785: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.16912: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 23826 1726867420.16926: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.17139: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867420.17242: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 23826 1726867420.17264: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 23826 1726867420.17305: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.17354: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 23826 1726867420.17728: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.17817: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 23826 1726867420.18127: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c381b9b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 23826 1726867420.18152: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.18175: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.18235: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 23826 1726867420.18280: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.18323: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.18373: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.18444: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 23826 1726867420.18485: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 23826 1726867420.18571: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3826420> <<< 23826 1726867420.18613: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3821b80> <<< 23826 1726867420.18657: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 23826 1726867420.18669: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.18722: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.18783: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.18812: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 23826 1726867420.18896: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 23826 1726867420.18911: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 23826 1726867420.18982: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 23826 1726867420.19014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 23826 1726867420.19028: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 23826 1726867420.19094: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c390ec00> <<< 23826 1726867420.19126: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c405a8d0> <<< 23826 1726867420.19223: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3826180> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3989040> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 23826 1726867420.19330: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.19336: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # <<< 23826 1726867420.19363: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 23826 1726867420.19387: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available <<< 23826 1726867420.19536: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.19553: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 23826 1726867420.19564: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.19598: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.19749: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 23826 1726867420.19798: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.19872: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.19893: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.19966: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 23826 1726867420.20194: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.20299: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.20339: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.20400: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 23826 1726867420.20424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 23826 1726867420.20452: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 23826 1726867420.20534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c38b6870> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 23826 1726867420.20549: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 23826 1726867420.20561: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 23826 1726867420.20655: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 23826 1726867420.20658: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c34f8290> <<< 23826 1726867420.20685: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 23826 1726867420.20793: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c34f85c0> <<< 23826 1726867420.20810: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c38a3620> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c38b7380> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c38b4f50> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c38b4b90> <<< 23826 1726867420.20840: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 23826 1726867420.21111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from 
'/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c34fb650> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c34faf00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c34fb0e0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c34fa330> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 23826 1726867420.21161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 23826 1726867420.21187: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c34fb830> <<< 23826 1726867420.21201: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 23826 1726867420.21232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 23826 1726867420.21261: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3542360> <<< 23826 1726867420.21298: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3540380> <<< 23826 1726867420.21339: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c38b4c80> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 23826 1726867420.21440: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 23826 1726867420.21460: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 23826 1726867420.21509: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 23826 1726867420.21536: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.21583: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.21630: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 23826 1726867420.21666: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.21786: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 23826 1726867420.21789: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 23826 1726867420.21802: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.21846: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 23826 1726867420.21872: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.21904: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.21969: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 23826 1726867420.22101: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.22119: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.22141: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.22211: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 23826 1726867420.22752: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.23136: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 23826 1726867420.23149: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.23194: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.23251: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.23292: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.23329: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 23826 1726867420.23373: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.23415: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 23826 1726867420.23474: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 23826 1726867420.23540: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 23826 1726867420.23553: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.23576: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.23598: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 23826 1726867420.23622: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.23686: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # <<< 23826 1726867420.23697: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.23771: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.23866: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 23826 1726867420.23970: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 23826 1726867420.23985: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3543bf0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 23826 1726867420.24084: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c35431d0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 23826 1726867420.24194: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.24298: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 23826 1726867420.24304: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 23826 1726867420.24318: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.24416: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 23826 1726867420.24484: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.24512: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.24609: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 23826 1726867420.24612: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.24631: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.24669: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 23826 1726867420.24741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 23826 1726867420.24794: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 23826 1726867420.25028: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3586660> <<< 23826 1726867420.25060: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3577500> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 23826 1726867420.25121: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.25171: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 23826 1726867420.25195: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.25274: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.25366: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.25483: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.25615: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 23826 1726867420.25635: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.25675: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.25721: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 23826 1726867420.25767: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.25815: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 23826 1726867420.25843: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 23826 1726867420.25869: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3599e50> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3599a60> <<< 23826 1726867420.25936: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: 
zlib available <<< 23826 1726867420.25964: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867420.26055: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 23826 1726867420.26441: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 23826 1726867420.26448: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.26544: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.26594: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.26646: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 23826 1726867420.26666: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.26928: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 23826 1726867420.26989: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 23826 1726867420.27002: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.27248: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # <<< 23826 1726867420.27268: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.27360: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.27385: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.27888: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.28400: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 23826 1726867420.28417: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.28627: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # <<< 23826 1726867420.28641: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.28766: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.28842: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 23826 1726867420.28845: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.29214: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 23826 1726867420.29217: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 23826 1726867420.29248: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.29380: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 23826 1726867420.29399: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.29495: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.29702: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.29913: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 23826 
1726867420.29926: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.29956: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.29991: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 23826 1726867420.30022: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.30036: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.30066: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 23826 1726867420.30141: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.30211: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 23826 1726867420.30242: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.30245: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.30275: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 23826 1726867420.30341: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.30453: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 23826 1726867420.30474: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 23826 1726867420.30536: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 23826 1726867420.30539: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.30796: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.31064: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 23826 1726867420.31155: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.31207: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 23826 1726867420.31380: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.31384: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 23826 1726867420.31396: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867420.31427: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 23826 1726867420.31442: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.31514: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.31611: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available <<< 23826 1726867420.31708: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867420.31728: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 23826 1726867420.31752: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.32006: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.32012: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.32056: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 
'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 23826 1726867420.32059: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.32106: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.32157: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 23826 1726867420.32358: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.32560: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 23826 1726867420.32573: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.32605: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.32659: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 23826 1726867420.32680: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.32709: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.32756: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 23826 1726867420.32785: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.32855: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.32935: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 23826 1726867420.32996: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.33034: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.33125: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 23826 1726867420.33213: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 23826 1726867420.33234: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.34205: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 23826 1726867420.34330: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3397680> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c33946b0> <<< 23826 1726867420.34448: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3395b50> <<< 23826 1726867420.34792: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "23", "second": "40", "epoch": "1726867420", "epoch_int": "1726867420", "date": "2024-09-20", "time": "17:23:40", "iso8601_micro": "2024-09-20T21:23:40.337142Z", "iso8601": "2024-09-20T21:23:40Z", "iso8601_basic": "20240920T172340337142", "iso8601_basic_short": "20240920T172340", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", 
"ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 23826 1726867420.35340: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing 
importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy <<< 23826 1726867420.35357: stdout chunk (state=3): >>># cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select <<< 23826 1726867420.35383: stdout chunk (state=3): >>># cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon <<< 23826 1726867420.35408: stdout chunk (state=3): >>># cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing 
ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing <<< 23826 1726867420.35446: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info <<< 23826 1726867420.35469: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing 
ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl<<< 23826 1726867420.35493: stdout chunk (state=3): >>> # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing 
ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos <<< 23826 1726867420.35531: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy 
ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 23826 1726867420.35854: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 23826 1726867420.35880: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 23826 1726867420.36065: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 23826 1726867420.36223: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 23826 1726867420.36228: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 23826 1726867420.36387: stdout chunk (state=3): >>># destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json <<< 23826 1726867420.36393: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 23826 1726867420.36616: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 23826 1726867420.36620: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # 
cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser <<< 23826 1726867420.36631: stdout chunk (state=3): >>># cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 23826 1726867420.36838: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 23826 1726867420.36842: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 23826 1726867420.37057: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # 
destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 23826 1726867420.37407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 23826 1726867420.37601: stderr chunk (state=3): >>><<< 23826 1726867420.37604: stdout chunk (state=3): >>><<< 23826 1726867420.38096: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c45104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c44dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4512a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c42c1130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c42c1fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c42ffe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c42fff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4337890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4337f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4317b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4315280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c42fd040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4357800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4356420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4316150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4354c80> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c438c890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c42fc2c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c438cd40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c438cbf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c438cfe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c42fade0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c438d6d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c438d3a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c438e5d0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c43a47a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c43a5eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f55c43a6d50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c43a7380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c43a62a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c43a7e00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c43a7530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c438e570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c40afce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c40d8740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c40d84a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c40d8770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c40d90a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c40d9a60> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f55c40d8950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c40ade80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c40dae10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c40d98e0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c438ecc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4103170> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c41274d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c41882f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c418aa20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c41883e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c414d2e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3f953d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c4126300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c40dbd40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f55c4126660> # zipimport: found 103 names in '/tmp/ansible_setup_payload_til5fzcn/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3fff0e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3fddfd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3fdd160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3ffd760> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c402eab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c402e840> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c402e150> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c402e5a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c45129c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c402f7a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c402f9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c402ff20> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c392dc10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c392f830> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3930200> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3931370> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3933e90> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c41030e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3932150> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c393bdd0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c393a8a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c393a600> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c393ab70> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3932660> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c393ab10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c39800e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3981ca0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3981a60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3984260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3982390> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3987a40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3984410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3988830> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3988aa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c39883e0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c39803b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3814320> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3815820> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c398aab0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c398be60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c398a6f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3819a30> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c381a840> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3815c70> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c381a5a0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c381b9b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3826420> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3821b80> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c390ec00> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c405a8d0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3826180> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3989040> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c38b6870> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c34f8290> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c34f85c0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c38a3620> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c38b7380> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c38b4f50> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c38b4b90> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c34fb650> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c34faf00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c34fb0e0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c34fa330> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c34fb830> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3542360> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3540380> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c38b4c80> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3543bf0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c35431d0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3586660> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3577500> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3599e50> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3599a60> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55c3397680> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c33946b0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55c3395b50> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "23", "second": "40", "epoch": "1726867420", "epoch_int": "1726867420", "date": "2024-09-20", "time": "17:23:40", "iso8601_micro": "2024-09-20T21:23:40.337142Z", "iso8601": "2024-09-20T21:23:40Z", "iso8601_basic": "20240920T172340337142", "iso8601_basic_short": "20240920T172340", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": 
"/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # 
cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # 
cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat 
# cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # 
cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] 
removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy 
ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing 
ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # 
destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy 
hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # 
cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 23826 1726867420.40617: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867419.8042023-23936-171661330374778/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867420.40621: _low_level_execute_command(): starting 23826 1726867420.40625: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867419.8042023-23936-171661330374778/ > /dev/null 2>&1 && sleep 0' 23826 1726867420.40628: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867420.40630: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867420.40632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867420.40634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867420.40637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867420.40638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867420.40640: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867420.40642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867420.40644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867420.42883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867420.42886: stderr chunk (state=3): >>><<< 23826 1726867420.42889: stdout chunk (state=3): >>><<< 23826 1726867420.42891: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867420.42893: handler run complete 23826 1726867420.42897: variable 'ansible_facts' from source: unknown 23826 1726867420.43014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867420.43340: variable 'ansible_facts' from source: unknown 23826 1726867420.43343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867420.43582: attempt loop complete, returning result 23826 1726867420.43585: _execute() done 23826 1726867420.43588: dumping result to json 23826 1726867420.43590: done dumping result, returning 23826 1726867420.43592: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcac9-a3a5-a92d-a3ea-0000000000b4] 23826 1726867420.43594: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000b4 23826 1726867420.43748: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000b4 23826 1726867420.43751: WORKER PROCESS EXITING ok: [managed_node2] 23826 1726867420.43853: no more pending results, returning what we have 23826 
1726867420.43856: results queue empty 23826 1726867420.43857: checking for any_errors_fatal 23826 1726867420.43858: done checking for any_errors_fatal 23826 1726867420.43859: checking for max_fail_percentage 23826 1726867420.43860: done checking for max_fail_percentage 23826 1726867420.43861: checking to see if all hosts have failed and the running result is not ok 23826 1726867420.43862: done checking to see if all hosts have failed 23826 1726867420.43863: getting the remaining hosts for this loop 23826 1726867420.43864: done getting the remaining hosts for this loop 23826 1726867420.43868: getting the next task for host managed_node2 23826 1726867420.43879: done getting next task for host managed_node2 23826 1726867420.43881: ^ task is: TASK: Check if system is ostree 23826 1726867420.43884: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867420.43888: getting variables 23826 1726867420.43889: in VariableManager get_vars() 23826 1726867420.43918: Calling all_inventory to load vars for managed_node2 23826 1726867420.43920: Calling groups_inventory to load vars for managed_node2 23826 1726867420.43924: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867420.43936: Calling all_plugins_play to load vars for managed_node2 23826 1726867420.43939: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867420.43942: Calling groups_plugins_play to load vars for managed_node2 23826 1726867420.44736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867420.45293: done with get_vars() 23826 1726867420.45302: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 17:23:40 -0400 (0:00:00.740) 0:00:02.468 ****** 23826 1726867420.45731: entering _queue_task() for managed_node2/stat 23826 1726867420.46345: worker is 1 (out of 1 available) 23826 1726867420.46357: exiting _queue_task() for managed_node2/stat 23826 1726867420.46368: done queuing things up, now waiting for results queue to drain 23826 1726867420.46369: waiting for pending results... 
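[Editor's illustrative sketch] The trace above has just finished the "Gather the minimum subset of ansible_facts" setup call (gather_subset 'min', visible in the module invocation earlier in the log) and is now queuing the "Check if system is ostree" stat task from el_repo_setup.yml:17. For readers unfamiliar with the pattern, such a task sequence is commonly written roughly as below. This is a hedged sketch, not the actual file: the task names, gather_subset value, and the __network_is_ostree conditional are taken from the log, while the /run/ostree-booted marker path, the register name __ostree_booted_stat, and the set_fact step are conventional assumptions added for illustration.

  - name: Gather the minimum subset of ansible_facts required by the network role test
    ansible.builtin.setup:
      gather_subset: min

  - name: Check if system is ostree
    ansible.builtin.stat:
      path: /run/ostree-booted          # assumed conventional marker file; not shown in this log
    register: __ostree_booted_stat       # hypothetical register name
    when: not __network_is_ostree is defined

  - name: Set flag to indicate system is ostree
    ansible.builtin.set_fact:
      __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
    when: not __network_is_ostree is defined

When the stat task actually runs, the worker goes through the same _low_level_execute_command() steps traced below: echoing the remote home directory, creating a per-task ansible-tmp directory over the multiplexed SSH connection, and then shipping and executing the module.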
23826 1726867420.47122: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 23826 1726867420.47338: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000000b6 23826 1726867420.47379: variable 'ansible_search_path' from source: unknown 23826 1726867420.47383: variable 'ansible_search_path' from source: unknown 23826 1726867420.47806: calling self._execute() 23826 1726867420.47810: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867420.47813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867420.47816: variable 'omit' from source: magic vars 23826 1726867420.49066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867420.49837: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867420.50110: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867420.50252: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867420.50439: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867420.50536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867420.50684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867420.50688: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867420.50691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867420.51090: Evaluated conditional (not __network_is_ostree is defined): True 23826 1726867420.51162: variable 'omit' from source: magic vars 23826 1726867420.51168: variable 'omit' from source: magic vars 23826 1726867420.51219: variable 'omit' from source: magic vars 23826 1726867420.51257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867420.51293: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867420.51336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867420.51358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867420.51375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867420.51429: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867420.51432: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867420.51434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867420.51549: Set connection var ansible_timeout to 10 23826 1726867420.51566: Set connection var ansible_shell_executable to /bin/sh 23826 1726867420.51574: Set 
connection var ansible_connection to ssh 23826 1726867420.51591: Set connection var ansible_pipelining to False 23826 1726867420.51600: Set connection var ansible_shell_type to sh 23826 1726867420.51610: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867420.51757: variable 'ansible_shell_executable' from source: unknown 23826 1726867420.51761: variable 'ansible_connection' from source: unknown 23826 1726867420.51764: variable 'ansible_module_compression' from source: unknown 23826 1726867420.51766: variable 'ansible_shell_type' from source: unknown 23826 1726867420.51768: variable 'ansible_shell_executable' from source: unknown 23826 1726867420.51785: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867420.51788: variable 'ansible_pipelining' from source: unknown 23826 1726867420.51790: variable 'ansible_timeout' from source: unknown 23826 1726867420.51792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867420.51887: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 23826 1726867420.51900: variable 'omit' from source: magic vars 23826 1726867420.51909: starting attempt loop 23826 1726867420.51914: running the handler 23826 1726867420.51932: _low_level_execute_command(): starting 23826 1726867420.51946: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867420.52900: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867420.52956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867420.53221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867420.53472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867420.55630: stdout chunk (state=3): >>>/root <<< 23826 1726867420.55790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867420.55835: stderr chunk (state=3): >>><<< 23826 1726867420.55851: stdout chunk (state=3): >>><<< 23826 1726867420.55875: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867420.55922: _low_level_execute_command(): starting 23826 1726867420.55925: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867420.558887-23982-94731081247142 `" && echo ansible-tmp-1726867420.558887-23982-94731081247142="` echo /root/.ansible/tmp/ansible-tmp-1726867420.558887-23982-94731081247142 `" ) && sleep 0' 23826 1726867420.56805: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867420.56869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867420.56922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867420.56944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867420.57299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867420.59785: stdout chunk (state=3): >>>ansible-tmp-1726867420.558887-23982-94731081247142=/root/.ansible/tmp/ansible-tmp-1726867420.558887-23982-94731081247142 <<< 23826 1726867420.59974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867420.60089: stdout chunk (state=3): >>><<< 23826 1726867420.60092: stderr chunk (state=3): >>><<< 23826 1726867420.60095: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867420.558887-23982-94731081247142=/root/.ansible/tmp/ansible-tmp-1726867420.558887-23982-94731081247142 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867420.60098: variable 'ansible_module_compression' from source: unknown 23826 1726867420.60160: ANSIBALLZ: Using lock for stat 23826 1726867420.60169: ANSIBALLZ: Acquiring lock 23826 1726867420.60180: ANSIBALLZ: Lock acquired: 139851310994816 23826 1726867420.60189: ANSIBALLZ: Creating module 23826 1726867420.74800: ANSIBALLZ: Writing module into payload 23826 1726867420.74900: ANSIBALLZ: Writing module 23826 1726867420.74984: ANSIBALLZ: Renaming module 23826 1726867420.74988: ANSIBALLZ: Done creating module 23826 1726867420.74991: variable 'ansible_facts' from source: unknown 23826 1726867420.75037: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867420.558887-23982-94731081247142/AnsiballZ_stat.py 23826 1726867420.75252: Sending initial data 23826 1726867420.75263: Sent initial data (151 bytes) 23826 1726867420.76201: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867420.76431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867420.76435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867420.76438: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867420.76488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867420.78383: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867420.78436: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 23826 1726867420.78492: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpt47u7a5u /root/.ansible/tmp/ansible-tmp-1726867420.558887-23982-94731081247142/AnsiballZ_stat.py <<< 23826 1726867420.78505: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867420.558887-23982-94731081247142/AnsiballZ_stat.py" <<< 23826 1726867420.78568: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 23826 1726867420.78581: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpt47u7a5u" to remote "/root/.ansible/tmp/ansible-tmp-1726867420.558887-23982-94731081247142/AnsiballZ_stat.py" <<< 23826 1726867420.78590: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867420.558887-23982-94731081247142/AnsiballZ_stat.py" <<< 23826 1726867420.79462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867420.79466: stdout chunk (state=3): >>><<< 23826 1726867420.79474: stderr chunk (state=3): >>><<< 23826 1726867420.79502: done transferring module to remote 23826 1726867420.79515: _low_level_execute_command(): starting 23826 1726867420.79520: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867420.558887-23982-94731081247142/ /root/.ansible/tmp/ansible-tmp-1726867420.558887-23982-94731081247142/AnsiballZ_stat.py && sleep 0' 23826 1726867420.80687: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867420.80690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867420.80767: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867420.80771: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867420.80773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867420.82447: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 23826 1726867420.82451: stdout chunk (state=3): >>><<< 23826 1726867420.82458: stderr chunk (state=3): >>><<< 23826 1726867420.82800: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867420.82804: _low_level_execute_command(): starting 23826 1726867420.82807: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867420.558887-23982-94731081247142/AnsiballZ_stat.py && sleep 0' 23826 1726867420.83840: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867420.83930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867420.83934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867420.83997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867420.84055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867420.87122: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 23826 1726867420.87202: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 23826 1726867420.87283: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 23826 
1726867420.87549: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12b104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12adfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12b12a50> import '_signal' # <<< 23826 1726867420.87552: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # <<< 23826 1726867420.87590: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 23826 1726867420.87675: stdout chunk (state=3): >>>import '_collections_abc' # <<< 23826 1726867420.87703: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 23826 1726867420.87735: stdout chunk (state=3): >>>import 'os' # <<< 23826 1726867420.87784: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 23826 1726867420.87808: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 23826 1726867420.87857: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c128e5130> <<< 23826 1726867420.87962: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c128e5fa0> import 'site' # <<< 23826 1726867420.87995: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 23826 1726867420.88225: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 23826 1726867420.88255: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 23826 1726867420.88303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 23826 1726867420.88485: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12923ec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 23826 1726867420.88515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12923f80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 23826 1726867420.88864: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1295b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1295bec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1293bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129392b0> <<< 23826 1726867420.89029: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12921070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1297b7d0> <<< 23826 1726867420.89036: stdout chunk (state=3): >>>import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9c1297a3f0> <<< 23826 1726867420.89067: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1293a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12978bc0> <<< 23826 1726867420.89136: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129b0890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129202f0> <<< 23826 1726867420.89241: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 23826 1726867420.89284: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c129b0d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129b0bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c129b0fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1291ee10> <<< 23826 1726867420.89413: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 23826 1726867420.89693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129b1670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129b1370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129b2540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129c8740> 
import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c129c9e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 23826 1726867420.89721: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129cacc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c129cb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129ca210> <<< 23826 1726867420.89744: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 23826 1726867420.89787: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 23826 1726867420.89819: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c129cbd70> <<< 23826 1726867420.89888: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129cb4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129b24b0> <<< 23826 1726867420.89906: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 23826 1726867420.89940: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 23826 1726867420.89981: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 23826 1726867420.90122: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 23826 1726867420.90126: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c1275fc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c127887a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12788500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c127887d0> <<< 23826 1726867420.90201: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 23826 1726867420.90325: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 23826 1726867420.90511: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c12789100> <<< 23826 1726867420.90666: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c12789af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c127889b0> <<< 23826 1726867420.90690: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1275ddf0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 23826 1726867420.90745: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 23826 1726867420.90774: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1278af00> <<< 23826 1726867420.90799: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12789c40> <<< 23826 1726867420.91035: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129b2c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 23826 1726867420.91039: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 23826 1726867420.91067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 23826 1726867420.91098: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c127af230> <<< 23826 1726867420.91172: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 23826 1726867420.91255: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 23826 1726867420.91390: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c127d7620> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 23826 1726867420.91393: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 23826 1726867420.91474: stdout chunk (state=3): >>>import 'ntpath' # <<< 23826 1726867420.91509: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12838380> <<< 23826 1726867420.91731: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 23826 1726867420.91758: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1283aae0> <<< 23826 1726867420.91898: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c128384a0> <<< 23826 1726867420.91992: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c127f93a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12129430> <<< 23826 1726867420.92058: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c127d6420> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1278be00> <<< 23826 1726867420.92196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9c127d6780> <<< 23826 1726867420.92435: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_x7bsn6s8/ansible_stat_payload.zip' <<< 23826 1726867420.92451: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.92783: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.92820: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 23826 1726867420.92867: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 23826 1726867420.92888: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 23826 1726867420.92924: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1217f1d0> import '_typing' # <<< 23826 1726867420.93209: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1215e0c0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1215d220> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 23826 1726867420.93212: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.94605: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.95744: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1217d070> <<< 23826 1726867420.95773: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 23826 1726867420.96051: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 23826 1726867420.96096: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c121a6b10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c121a68a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c121a61b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c121a6600> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1217fbf0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c121a78c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7f9c121a7b00> <<< 23826 1726867420.96109: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 23826 1726867420.96167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 23826 1726867420.96217: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c121a7f80> import 'pwd' # <<< 23826 1726867420.96245: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 23826 1726867420.96388: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12011d90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c120139b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 23826 1726867420.96416: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c120143b0> <<< 23826 1726867420.96437: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 23826 1726867420.96470: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 23826 1726867420.96502: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12015550> <<< 23826 1726867420.96740: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 23826 1726867420.96759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12017fb0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c1201c2f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12016270> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 23826 1726867420.96787: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 23826 1726867420.96832: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 23826 1726867420.96891: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1201ff20> import '_tokenize' # <<< 23826 1726867420.96931: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1201e9f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1201e780> <<< 23826 1726867420.96958: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 23826 1726867420.97046: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1201ecc0> <<< 23826 1726867420.97368: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12016780> <<< 23826 1726867420.97372: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c12067fb0> <<< 23826 1726867420.97378: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c120682f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c12069dc0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12069b80> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 23826 1726867420.97480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 23826 1726867420.97483: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c1206c320> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1206a4b0> <<< 23826 
1726867420.97514: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 23826 1726867420.97538: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 23826 1726867420.97586: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1206fb00> <<< 23826 1726867420.97956: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1206c4d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c120708f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c12070b30> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c12070dd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c120684d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 23826 1726867420.97992: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c120fc3b0> <<< 23826 1726867420.98245: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c120fd7f0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12072b70> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 
'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c12073f20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c120727b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 23826 1726867420.98466: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.98473: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.98476: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 23826 1726867420.98512: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 23826 1726867420.98610: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.98910: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867420.99332: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867421.00047: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c11f01940> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c11f026f0> <<< 23826 1726867421.00114: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c120fd910> import 'ansible.module_utils.compat.selinux' # <<< 23826 1726867421.00134: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867421.00162: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 23826 1726867421.00179: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867421.00319: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867421.00895: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c11f02690> # zipimport: zlib available <<< 23826 1726867421.00967: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867421.01421: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867421.01671: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 23826 1726867421.01726: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 
1726867421.01812: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 23826 1726867421.01862: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 23826 1726867421.01899: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867421.02109: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 23826 1726867421.02180: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867421.02553: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c11f038c0> <<< 23826 1726867421.02643: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 23826 1726867421.02795: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available <<< 23826 1726867421.02834: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 23826 1726867421.03021: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867421.03037: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867421.03064: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 23826 1726867421.03221: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c11f0e3f0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c11f09a60> <<< 23826 1726867421.03254: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 23826 1726867421.03292: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867421.03327: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867421.03580: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867421.03583: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867421.03625: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches 
/usr/lib64/python3.12/argparse.py <<< 23826 1726867421.03632: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 23826 1726867421.03654: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 23826 1726867421.03690: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c121fab70> <<< 23826 1726867421.03762: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c121ee840> <<< 23826 1726867421.03844: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c11f0e150> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c120736b0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 23826 1726867421.04002: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 23826 1726867421.04016: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 23826 1726867421.04121: stdout chunk (state=3): >>># zipimport: zlib available <<< 23826 1726867421.04879: stdout chunk (state=3): >>># zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 23826 1726867421.04885: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback<<< 23826 1726867421.04890: stdout chunk (state=3): >>> # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs<<< 23826 1726867421.04893: stdout chunk (state=3): >>> # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site<<< 23826 1726867421.05050: stdout chunk (state=3): >>> # cleanup[2] removing types # cleanup[2] 
removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse <<< 23826 1726867421.05060: stdout chunk (state=3): >>># destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess <<< 23826 1726867421.05112: stdout chunk (state=3): >>># cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing 
logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 23826 1726867421.05418: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # 
destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 23826 1726867421.05601: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime <<< 23826 1726867421.05605: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 23826 1726867421.05984: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 23826 1726867421.05988: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect <<< 23826 1726867421.05990: stdout chunk (state=3): >>># cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy 
systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 23826 1726867421.06021: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 23826 1726867421.06084: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 <<< 23826 1726867421.06116: stdout chunk (state=3): >>># destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 23826 1726867421.06176: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re <<< 23826 1726867421.06345: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 23826 1726867421.06785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
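[editor's note, a hedged aside before the consolidated stdout below] The streamed chunks above end with the stat module's actual result, {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", ...}}}, buried inside the Python import/teardown noise. The following is a minimal, hypothetical Python sketch (not part of Ansible; the function name extract_module_result and the scanning approach are my own) for pulling that JSON result back out of a verbose stdout capture like this one. It only assumes what the log itself shows: the result is a JSON object that carries an "invocation" key.

import json

def extract_module_result(stdout_text: str) -> dict | None:
    """Return the first JSON object in the text that has an 'invocation' key, or None."""
    decoder = json.JSONDecoder()
    idx = stdout_text.find("{")
    while idx != -1:
        try:
            # raw_decode parses one JSON value starting at idx and ignores trailing text
            obj, _end = decoder.raw_decode(stdout_text, idx)
        except json.JSONDecodeError:
            obj = None
        if isinstance(obj, dict) and "invocation" in obj:
            return obj
        idx = stdout_text.find("{", idx + 1)
    return None

if __name__ == "__main__":
    # Sample fragment copied from the log above (module result surrounded by import noise)
    sample = (
        '# zipimport: zlib available {"changed": false, "stat": {"exists": false}, '
        '"invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, '
        '"get_checksum": true, "get_mime": true, "get_attributes": true, '
        '"checksum_algorithm": "sha1"}}} # destroy __main__'
    )
    result = extract_module_result(sample)
    print(result["stat"]["exists"])  # False: /run/ostree-booted does not exist on the target

Under that assumption, the printed value confirms what the controller concludes from this task: the managed node is not ostree-booted. The consolidated output that Ansible itself logs for this command follows.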
<<< 23826 1726867421.06808: stdout chunk (state=3): >>><<< 23826 1726867421.06812: stderr chunk (state=3): >>><<< 23826 1726867421.07290: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12b104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12adfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12b12a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c128e5130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c128e5fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12923ec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12923f80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1295b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1295bec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1293bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129392b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12921070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1297b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1297a3f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1293a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12978bc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129b0890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129202f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c129b0d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129b0bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c129b0fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1291ee10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129b1670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129b1370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129b2540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129c8740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c129c9e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9c129cacc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c129cb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129ca210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c129cbd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129cb4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129b24b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c1275fc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c127887a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12788500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c127887d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c12789100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c12789af0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9c127889b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1275ddf0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1278af00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12789c40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c129b2c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c127af230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c127d7620> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12838380> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1283aae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c128384a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c127f93a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12129430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c127d6420> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1278be00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f9c127d6780> # zipimport: found 30 names in '/tmp/ansible_stat_payload_x7bsn6s8/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1217f1d0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1215e0c0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1215d220> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1217d070> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c121a6b10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c121a68a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c121a61b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c121a6600> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1217fbf0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c121a78c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c121a7b00> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c121a7f80> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12011d90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c120139b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c120143b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12015550> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12017fb0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c1201c2f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12016270> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1201ff20> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1201e9f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1201e780> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1201ecc0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12016780> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c12067fb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c120682f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c12069dc0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12069b80> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c1206c320> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1206a4b0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1206fb00> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1206c4d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c120708f0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c12070b30> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c12070dd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c120684d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c120fc3b0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c120fd7f0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c12072b70> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c12073f20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c120727b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c11f01940> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c11f026f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c120fd910> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c11f02690> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c11f038c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c11f0e3f0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c11f09a60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c121fab70> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c121ee840> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c11f0e150> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c120736b0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # 
destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # 
destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
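The module run above is the remote half of the "Check if system is ostree" task reported a few records later: Ansible shipped a stat module payload over the multiplexed SSH connection and got back {"changed": false, "stat": {"exists": false}}. Below is a minimal sketch of the task that presumably drives it, reconstructed from the module_args in the log and the __ostree_booted_stat variable referenced further down; the exact YAML in el_repo_setup.yml is not shown here and is an assumption.

```yaml
# Hedged reconstruction of the task behind the module invocation above.
# path=/run/ostree-booted comes from the module_args printed in the log;
# registering the result as __ostree_booted_stat matches the variable the
# later set_fact task reads.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
```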
[WARNING]: Module invocation had junk after the JSON data: (same Python interpreter shutdown and cleanup messages as printed above) 23826 1726867421.09017: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867420.558887-23982-94731081247142/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867421.09020: _low_level_execute_command(): starting 23826 1726867421.09022: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r
/root/.ansible/tmp/ansible-tmp-1726867420.558887-23982-94731081247142/ > /dev/null 2>&1 && sleep 0' 23826 1726867421.09532: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867421.09535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867421.09585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867421.09715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867421.09800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867421.11749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867421.11752: stdout chunk (state=3): >>><<< 23826 1726867421.11759: stderr chunk (state=3): >>><<< 23826 1726867421.11775: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867421.11806: handler run complete 23826 1726867421.11829: attempt loop complete, returning result 23826 1726867421.11832: _execute() done 23826 1726867421.11835: dumping result to json 23826 1726867421.11837: done dumping result, returning 23826 1726867421.11846: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [0affcac9-a3a5-a92d-a3ea-0000000000b6] 23826 1726867421.11849: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000b6 23826 1726867421.12267: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000b6 23826 
1726867421.12271: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 23826 1726867421.12334: no more pending results, returning what we have 23826 1726867421.12337: results queue empty 23826 1726867421.12338: checking for any_errors_fatal 23826 1726867421.12344: done checking for any_errors_fatal 23826 1726867421.12345: checking for max_fail_percentage 23826 1726867421.12346: done checking for max_fail_percentage 23826 1726867421.12347: checking to see if all hosts have failed and the running result is not ok 23826 1726867421.12347: done checking to see if all hosts have failed 23826 1726867421.12348: getting the remaining hosts for this loop 23826 1726867421.12350: done getting the remaining hosts for this loop 23826 1726867421.12353: getting the next task for host managed_node2 23826 1726867421.12359: done getting next task for host managed_node2 23826 1726867421.12364: ^ task is: TASK: Set flag to indicate system is ostree 23826 1726867421.12366: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867421.12370: getting variables 23826 1726867421.12371: in VariableManager get_vars() 23826 1726867421.12400: Calling all_inventory to load vars for managed_node2 23826 1726867421.12403: Calling groups_inventory to load vars for managed_node2 23826 1726867421.12406: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867421.12418: Calling all_plugins_play to load vars for managed_node2 23826 1726867421.12420: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867421.12423: Calling groups_plugins_play to load vars for managed_node2 23826 1726867421.12680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867421.13547: done with get_vars() 23826 1726867421.13557: done getting variables 23826 1726867421.13655: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 17:23:41 -0400 (0:00:00.681) 0:00:03.150 ****** 23826 1726867421.13914: entering _queue_task() for managed_node2/set_fact 23826 1726867421.13915: Creating lock for set_fact 23826 1726867421.14774: worker is 1 (out of 1 available) 23826 1726867421.14785: exiting _queue_task() for managed_node2/set_fact 23826 1726867421.14794: done queuing things up, now waiting for results queue to drain 23826 1726867421.14795: waiting for pending results... 
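The set_fact task just queued converts the registered stat result into a boolean fact. Based on the conditional and result that follow in the log ("not __network_is_ostree is defined" evaluating True, then __network_is_ostree: false), the task at el_repo_setup.yml:22 plausibly has the shape below; the exact expression is an assumption.

```yaml
# Hedged sketch of the set_fact task. The guard mirrors the conditional the
# log reports; the fact value is derived from the stat result registered by
# the previous task (exists == false on this host).
- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```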
23826 1726867421.15401: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 23826 1726867421.15406: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000000b7 23826 1726867421.15409: variable 'ansible_search_path' from source: unknown 23826 1726867421.15411: variable 'ansible_search_path' from source: unknown 23826 1726867421.15414: calling self._execute() 23826 1726867421.15416: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867421.15419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867421.15422: variable 'omit' from source: magic vars 23826 1726867421.16610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867421.17067: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867421.17324: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867421.17327: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867421.17432: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867421.17759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867421.17763: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867421.17766: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867421.17768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867421.17984: Evaluated conditional (not __network_is_ostree is defined): True 23826 1726867421.18394: variable 'omit' from source: magic vars 23826 1726867421.18445: variable 'omit' from source: magic vars 23826 1726867421.18887: variable '__ostree_booted_stat' from source: set_fact 23826 1726867421.19010: variable 'omit' from source: magic vars 23826 1726867421.19153: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867421.19188: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867421.19214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867421.19584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867421.19587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867421.19590: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867421.19592: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867421.19595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867421.19739: Set connection var ansible_timeout to 10 23826 1726867421.20016: 
Set connection var ansible_shell_executable to /bin/sh 23826 1726867421.20019: Set connection var ansible_connection to ssh 23826 1726867421.20021: Set connection var ansible_pipelining to False 23826 1726867421.20022: Set connection var ansible_shell_type to sh 23826 1726867421.20024: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867421.20047: variable 'ansible_shell_executable' from source: unknown 23826 1726867421.20183: variable 'ansible_connection' from source: unknown 23826 1726867421.20186: variable 'ansible_module_compression' from source: unknown 23826 1726867421.20188: variable 'ansible_shell_type' from source: unknown 23826 1726867421.20190: variable 'ansible_shell_executable' from source: unknown 23826 1726867421.20191: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867421.20193: variable 'ansible_pipelining' from source: unknown 23826 1726867421.20194: variable 'ansible_timeout' from source: unknown 23826 1726867421.20196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867421.20474: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867421.20494: variable 'omit' from source: magic vars 23826 1726867421.20505: starting attempt loop 23826 1726867421.20517: running the handler 23826 1726867421.20573: handler run complete 23826 1726867421.20782: attempt loop complete, returning result 23826 1726867421.20786: _execute() done 23826 1726867421.20788: dumping result to json 23826 1726867421.20790: done dumping result, returning 23826 1726867421.20793: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [0affcac9-a3a5-a92d-a3ea-0000000000b7] 23826 1726867421.20795: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000b7 23826 1726867421.20869: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000b7 23826 1726867421.20873: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 23826 1726867421.20932: no more pending results, returning what we have 23826 1726867421.20935: results queue empty 23826 1726867421.20936: checking for any_errors_fatal 23826 1726867421.20942: done checking for any_errors_fatal 23826 1726867421.20943: checking for max_fail_percentage 23826 1726867421.20945: done checking for max_fail_percentage 23826 1726867421.20946: checking to see if all hosts have failed and the running result is not ok 23826 1726867421.20947: done checking to see if all hosts have failed 23826 1726867421.20948: getting the remaining hosts for this loop 23826 1726867421.20949: done getting the remaining hosts for this loop 23826 1726867421.20953: getting the next task for host managed_node2 23826 1726867421.20963: done getting next task for host managed_node2 23826 1726867421.20966: ^ task is: TASK: Fix CentOS6 Base repo 23826 1726867421.20969: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867421.20972: getting variables 23826 1726867421.20974: in VariableManager get_vars() 23826 1726867421.21008: Calling all_inventory to load vars for managed_node2 23826 1726867421.21011: Calling groups_inventory to load vars for managed_node2 23826 1726867421.21014: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867421.21024: Calling all_plugins_play to load vars for managed_node2 23826 1726867421.21026: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867421.21033: Calling groups_plugins_play to load vars for managed_node2 23826 1726867421.21668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867421.22519: done with get_vars() 23826 1726867421.22529: done getting variables 23826 1726867421.22641: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 17:23:41 -0400 (0:00:00.087) 0:00:03.238 ****** 23826 1726867421.22668: entering _queue_task() for managed_node2/copy 23826 1726867421.23522: worker is 1 (out of 1 available) 23826 1726867421.23532: exiting _queue_task() for managed_node2/copy 23826 1726867421.23541: done queuing things up, now waiting for results queue to drain 23826 1726867421.23543: waiting for pending results... 
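The "Fix CentOS6 Base repo" task queued above is a copy action gated on distribution facts; the records that follow show ansible_distribution == 'CentOS' evaluating True and ansible_distribution_major_version == '6' evaluating False, so the task is skipped. A hedged sketch of its shape, with placeholder file details since only the when conditions are visible in the log:

```yaml
# Hedged sketch of the conditional copy task. Ansible evaluates the when list
# item by item and reports the first failing expression as "false_condition",
# which is what the skip result below shows. dest and content are placeholders;
# the real repo file is not visible in this log.
- name: Fix CentOS6 Base repo
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # placeholder path
    content: |
      # repo definition not shown in the log
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```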
23826 1726867421.23872: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 23826 1726867421.24039: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000000b9 23826 1726867421.24092: variable 'ansible_search_path' from source: unknown 23826 1726867421.24318: variable 'ansible_search_path' from source: unknown 23826 1726867421.24322: calling self._execute() 23826 1726867421.24415: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867421.24428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867421.24457: variable 'omit' from source: magic vars 23826 1726867421.25546: variable 'ansible_distribution' from source: facts 23826 1726867421.25628: Evaluated conditional (ansible_distribution == 'CentOS'): True 23826 1726867421.26184: variable 'ansible_distribution_major_version' from source: facts 23826 1726867421.26187: Evaluated conditional (ansible_distribution_major_version == '6'): False 23826 1726867421.26189: when evaluation is False, skipping this task 23826 1726867421.26192: _execute() done 23826 1726867421.26194: dumping result to json 23826 1726867421.26197: done dumping result, returning 23826 1726867421.26199: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [0affcac9-a3a5-a92d-a3ea-0000000000b9] 23826 1726867421.26201: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000b9 23826 1726867421.26273: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000b9 23826 1726867421.26279: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 23826 1726867421.26347: no more pending results, returning what we have 23826 1726867421.26350: results queue empty 23826 1726867421.26351: checking for any_errors_fatal 23826 1726867421.26356: done checking for any_errors_fatal 23826 1726867421.26356: checking for max_fail_percentage 23826 1726867421.26358: done checking for max_fail_percentage 23826 1726867421.26359: checking to see if all hosts have failed and the running result is not ok 23826 1726867421.26360: done checking to see if all hosts have failed 23826 1726867421.26360: getting the remaining hosts for this loop 23826 1726867421.26362: done getting the remaining hosts for this loop 23826 1726867421.26365: getting the next task for host managed_node2 23826 1726867421.26372: done getting next task for host managed_node2 23826 1726867421.26374: ^ task is: TASK: Include the task 'enable_epel.yml' 23826 1726867421.26379: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867421.26383: getting variables 23826 1726867421.26384: in VariableManager get_vars() 23826 1726867421.26413: Calling all_inventory to load vars for managed_node2 23826 1726867421.26416: Calling groups_inventory to load vars for managed_node2 23826 1726867421.26420: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867421.26432: Calling all_plugins_play to load vars for managed_node2 23826 1726867421.26434: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867421.26437: Calling groups_plugins_play to load vars for managed_node2 23826 1726867421.27414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867421.27629: done with get_vars() 23826 1726867421.27638: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 17:23:41 -0400 (0:00:00.051) 0:00:03.289 ****** 23826 1726867421.27839: entering _queue_task() for managed_node2/include_tasks 23826 1726867421.28344: worker is 1 (out of 1 available) 23826 1726867421.28354: exiting _queue_task() for managed_node2/include_tasks 23826 1726867421.28364: done queuing things up, now waiting for results queue to drain 23826 1726867421.28365: waiting for pending results... 23826 1726867421.29175: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 23826 1726867421.29182: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000000ba 23826 1726867421.29184: variable 'ansible_search_path' from source: unknown 23826 1726867421.29187: variable 'ansible_search_path' from source: unknown 23826 1726867421.29189: calling self._execute() 23826 1726867421.29329: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867421.29432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867421.29448: variable 'omit' from source: magic vars 23826 1726867421.30436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867421.34874: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867421.34946: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867421.35020: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867421.35124: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867421.35213: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867421.35473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867421.35506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867421.35570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 23826 1726867421.35682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867421.35761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867421.36032: variable '__network_is_ostree' from source: set_fact 23826 1726867421.36057: Evaluated conditional (not __network_is_ostree | d(false)): True 23826 1726867421.36067: _execute() done 23826 1726867421.36074: dumping result to json 23826 1726867421.36083: done dumping result, returning 23826 1726867421.36108: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [0affcac9-a3a5-a92d-a3ea-0000000000ba] 23826 1726867421.36144: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000ba 23826 1726867421.36292: no more pending results, returning what we have 23826 1726867421.36297: in VariableManager get_vars() 23826 1726867421.36332: Calling all_inventory to load vars for managed_node2 23826 1726867421.36335: Calling groups_inventory to load vars for managed_node2 23826 1726867421.36338: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867421.36350: Calling all_plugins_play to load vars for managed_node2 23826 1726867421.36353: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867421.36356: Calling groups_plugins_play to load vars for managed_node2 23826 1726867421.37247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867421.37890: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000ba 23826 1726867421.37894: WORKER PROCESS EXITING 23826 1726867421.37954: done with get_vars() 23826 1726867421.37963: variable 'ansible_search_path' from source: unknown 23826 1726867421.37964: variable 'ansible_search_path' from source: unknown 23826 1726867421.38014: we have included files to process 23826 1726867421.38016: generating all_blocks data 23826 1726867421.38018: done generating all_blocks data 23826 1726867421.38024: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 23826 1726867421.38025: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 23826 1726867421.38028: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 23826 1726867421.39476: done processing included file 23826 1726867421.39482: iterating over new_blocks loaded from include file 23826 1726867421.39483: in VariableManager get_vars() 23826 1726867421.39496: done with get_vars() 23826 1726867421.39498: filtering new block on tags 23826 1726867421.39525: done filtering new block on tags 23826 1726867421.39528: in VariableManager get_vars() 23826 1726867421.39538: done with get_vars() 23826 1726867421.39540: filtering new block on tags 23826 1726867421.39550: done filtering new block on tags 23826 1726867421.39552: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 23826 1726867421.39557: extending task lists for all hosts with included blocks 
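The include that was just processed pulls enable_epel.yml into the play. The conditional "not __network_is_ostree | d(false)" is quoted verbatim in the log; the relative path below is an assumption based on both files sitting in the same tests/network/tasks directory according to the task paths.

```yaml
# Hedged sketch of the include task at el_repo_setup.yml:51. The when
# expression is the one evaluated in the log; the path is assumed relative
# to the including file.
- name: Include the task 'enable_epel.yml'
  ansible.builtin.include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)
```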
23826 1726867421.39870: done extending task lists 23826 1726867421.39871: done processing included files 23826 1726867421.39872: results queue empty 23826 1726867421.39873: checking for any_errors_fatal 23826 1726867421.39876: done checking for any_errors_fatal 23826 1726867421.39879: checking for max_fail_percentage 23826 1726867421.39880: done checking for max_fail_percentage 23826 1726867421.39881: checking to see if all hosts have failed and the running result is not ok 23826 1726867421.39882: done checking to see if all hosts have failed 23826 1726867421.39883: getting the remaining hosts for this loop 23826 1726867421.39884: done getting the remaining hosts for this loop 23826 1726867421.39985: getting the next task for host managed_node2 23826 1726867421.39990: done getting next task for host managed_node2 23826 1726867421.39992: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 23826 1726867421.40081: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867421.40084: getting variables 23826 1726867421.40085: in VariableManager get_vars() 23826 1726867421.40094: Calling all_inventory to load vars for managed_node2 23826 1726867421.40096: Calling groups_inventory to load vars for managed_node2 23826 1726867421.40099: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867421.40112: Calling all_plugins_play to load vars for managed_node2 23826 1726867421.40119: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867421.40123: Calling groups_plugins_play to load vars for managed_node2 23826 1726867421.40444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867421.40756: done with get_vars() 23826 1726867421.40764: done getting variables 23826 1726867421.40832: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 23826 1726867421.41097: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 17:23:41 -0400 (0:00:00.133) 0:00:03.422 ****** 23826 1726867421.41147: entering _queue_task() for managed_node2/command 23826 1726867421.41148: Creating lock for command 23826 1726867421.41573: worker is 1 (out of 1 available) 23826 1726867421.41586: exiting _queue_task() for managed_node2/command 23826 1726867421.41594: done queuing things up, now waiting for results queue to drain 23826 1726867421.41596: waiting for pending results... 
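The templated task name "Create EPEL {{ ansible_distribution_major_version }}" from the included file is rendered with the host's facts before the banner is printed, which is why the header reads "Create EPEL 10". The records that follow show its conditions: the distribution check passes but the major-version check against ['7', '8'] fails, so the task is skipped. A hedged sketch; the command body is a placeholder because the real command at enable_epel.yml:8 never runs or gets printed here.

```yaml
# Hedged sketch of the skipped command task. The Jinja2 template in the name
# is rendered before display; the cmd below is a placeholder, not the real
# command from enable_epel.yml.
- name: Create EPEL {{ ansible_distribution_major_version }}
  ansible.builtin.command:
    cmd: /bin/true   # placeholder; actual command not visible in this log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```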
23826 1726867421.41776: running TaskExecutor() for managed_node2/TASK: Create EPEL 10 23826 1726867421.41891: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000000d4 23826 1726867421.41913: variable 'ansible_search_path' from source: unknown 23826 1726867421.41920: variable 'ansible_search_path' from source: unknown 23826 1726867421.41962: calling self._execute() 23826 1726867421.42052: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867421.42067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867421.42086: variable 'omit' from source: magic vars 23826 1726867421.42488: variable 'ansible_distribution' from source: facts 23826 1726867421.42505: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 23826 1726867421.42646: variable 'ansible_distribution_major_version' from source: facts 23826 1726867421.42658: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 23826 1726867421.42666: when evaluation is False, skipping this task 23826 1726867421.42673: _execute() done 23826 1726867421.42683: dumping result to json 23826 1726867421.42694: done dumping result, returning 23826 1726867421.42704: done running TaskExecutor() for managed_node2/TASK: Create EPEL 10 [0affcac9-a3a5-a92d-a3ea-0000000000d4] 23826 1726867421.42716: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000d4 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 23826 1726867421.42956: no more pending results, returning what we have 23826 1726867421.42960: results queue empty 23826 1726867421.42961: checking for any_errors_fatal 23826 1726867421.42962: done checking for any_errors_fatal 23826 1726867421.42963: checking for max_fail_percentage 23826 1726867421.42965: done checking for max_fail_percentage 23826 1726867421.42965: checking to see if all hosts have failed and the running result is not ok 23826 1726867421.42966: done checking to see if all hosts have failed 23826 1726867421.42967: getting the remaining hosts for this loop 23826 1726867421.42968: done getting the remaining hosts for this loop 23826 1726867421.42972: getting the next task for host managed_node2 23826 1726867421.42982: done getting next task for host managed_node2 23826 1726867421.42984: ^ task is: TASK: Install yum-utils package 23826 1726867421.42989: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867421.42993: getting variables 23826 1726867421.42994: in VariableManager get_vars() 23826 1726867421.43138: Calling all_inventory to load vars for managed_node2 23826 1726867421.43141: Calling groups_inventory to load vars for managed_node2 23826 1726867421.43144: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867421.43158: Calling all_plugins_play to load vars for managed_node2 23826 1726867421.43160: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867421.43163: Calling groups_plugins_play to load vars for managed_node2 23826 1726867421.43723: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000d4 23826 1726867421.43726: WORKER PROCESS EXITING 23826 1726867421.43748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867421.44198: done with get_vars() 23826 1726867421.44210: done getting variables 23826 1726867421.44425: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 17:23:41 -0400 (0:00:00.033) 0:00:03.455 ****** 23826 1726867421.44457: entering _queue_task() for managed_node2/package 23826 1726867421.44459: Creating lock for package 23826 1726867421.44898: worker is 1 (out of 1 available) 23826 1726867421.44913: exiting _queue_task() for managed_node2/package 23826 1726867421.45035: done queuing things up, now waiting for results queue to drain 23826 1726867421.45037: waiting for pending results... 
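The "Install yum-utils package" task is dispatched through the package action plugin and, as the following records show, is skipped for the same reason as the previous one. A hedged sketch, assuming the package list matches the task name; only the when conditions and the skip reason are confirmed by the log:

```yaml
# Hedged sketch of the package task at enable_epel.yml:26. The package name is
# inferred from the task name; the when list mirrors the conditionals the log
# reports, with the major-version check as the false_condition.
- name: Install yum-utils package
  ansible.builtin.package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```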
23826 1726867421.45184: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 23826 1726867421.45305: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000000d5 23826 1726867421.45328: variable 'ansible_search_path' from source: unknown 23826 1726867421.45336: variable 'ansible_search_path' from source: unknown 23826 1726867421.45383: calling self._execute() 23826 1726867421.45472: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867421.45486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867421.45500: variable 'omit' from source: magic vars 23826 1726867421.45905: variable 'ansible_distribution' from source: facts 23826 1726867421.45926: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 23826 1726867421.46067: variable 'ansible_distribution_major_version' from source: facts 23826 1726867421.46081: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 23826 1726867421.46090: when evaluation is False, skipping this task 23826 1726867421.46098: _execute() done 23826 1726867421.46105: dumping result to json 23826 1726867421.46121: done dumping result, returning 23826 1726867421.46135: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [0affcac9-a3a5-a92d-a3ea-0000000000d5] 23826 1726867421.46182: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000d5 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 23826 1726867421.46484: no more pending results, returning what we have 23826 1726867421.46488: results queue empty 23826 1726867421.46489: checking for any_errors_fatal 23826 1726867421.46492: done checking for any_errors_fatal 23826 1726867421.46493: checking for max_fail_percentage 23826 1726867421.46494: done checking for max_fail_percentage 23826 1726867421.46495: checking to see if all hosts have failed and the running result is not ok 23826 1726867421.46496: done checking to see if all hosts have failed 23826 1726867421.46497: getting the remaining hosts for this loop 23826 1726867421.46498: done getting the remaining hosts for this loop 23826 1726867421.46501: getting the next task for host managed_node2 23826 1726867421.46509: done getting next task for host managed_node2 23826 1726867421.46511: ^ task is: TASK: Enable EPEL 7 23826 1726867421.46515: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867421.46519: getting variables 23826 1726867421.46520: in VariableManager get_vars() 23826 1726867421.46547: Calling all_inventory to load vars for managed_node2 23826 1726867421.46550: Calling groups_inventory to load vars for managed_node2 23826 1726867421.46553: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867421.46561: Calling all_plugins_play to load vars for managed_node2 23826 1726867421.46564: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867421.46566: Calling groups_plugins_play to load vars for managed_node2 23826 1726867421.46721: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000d5 23826 1726867421.46724: WORKER PROCESS EXITING 23826 1726867421.46746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867421.46946: done with get_vars() 23826 1726867421.46956: done getting variables 23826 1726867421.47017: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 17:23:41 -0400 (0:00:00.025) 0:00:03.481 ****** 23826 1726867421.47045: entering _queue_task() for managed_node2/command 23826 1726867421.47275: worker is 1 (out of 1 available) 23826 1726867421.47340: exiting _queue_task() for managed_node2/command 23826 1726867421.47350: done queuing things up, now waiting for results queue to drain 23826 1726867421.47351: waiting for pending results... 
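
The skipping: [managed_node2] => {...} payloads printed for each of these tasks are flat JSON objects, so they can be pulled back out of a saved -vvvv log for reporting. Below is a small standalone helper written against the format seen in this log; SKIP_RE and skipped_results are names introduced here for illustration, not part of any Ansible API.

# Illustrative helper: the "skipping: [host] => {...}" payloads in a -vvvv log
# are flat JSON objects, so a non-greedy match up to the first closing brace
# is enough to recover them (re.DOTALL handles pretty-printed payloads).
import json
import re

SKIP_RE = re.compile(r"skipping: \[(?P<host>[\w.-]+)\] => (?P<payload>\{.*?\})", re.DOTALL)

def skipped_results(log_text):
    """Yield (host, result_dict) for every skipped task result in the log text."""
    for match in SKIP_RE.finditer(log_text):
        yield match.group("host"), json.loads(match.group("payload"))

sample = ('skipping: [managed_node2] => { "changed": false, '
          '"false_condition": "ansible_distribution_major_version in [\'7\', \'8\']", '
          '"skip_reason": "Conditional result was False" }')
for host, result in skipped_results(sample):
    print(host, "->", result["false_condition"])
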
23826 1726867421.47542: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 23826 1726867421.47646: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000000d6 23826 1726867421.47672: variable 'ansible_search_path' from source: unknown 23826 1726867421.47682: variable 'ansible_search_path' from source: unknown 23826 1726867421.47722: calling self._execute() 23826 1726867421.47799: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867421.47811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867421.47823: variable 'omit' from source: magic vars 23826 1726867421.48475: variable 'ansible_distribution' from source: facts 23826 1726867421.48682: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 23826 1726867421.48937: variable 'ansible_distribution_major_version' from source: facts 23826 1726867421.48940: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 23826 1726867421.48942: when evaluation is False, skipping this task 23826 1726867421.48944: _execute() done 23826 1726867421.48946: dumping result to json 23826 1726867421.48948: done dumping result, returning 23826 1726867421.48951: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [0affcac9-a3a5-a92d-a3ea-0000000000d6] 23826 1726867421.48953: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000d6 23826 1726867421.49022: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000d6 23826 1726867421.49025: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 23826 1726867421.49088: no more pending results, returning what we have 23826 1726867421.49091: results queue empty 23826 1726867421.49092: checking for any_errors_fatal 23826 1726867421.49097: done checking for any_errors_fatal 23826 1726867421.49098: checking for max_fail_percentage 23826 1726867421.49100: done checking for max_fail_percentage 23826 1726867421.49100: checking to see if all hosts have failed and the running result is not ok 23826 1726867421.49101: done checking to see if all hosts have failed 23826 1726867421.49102: getting the remaining hosts for this loop 23826 1726867421.49104: done getting the remaining hosts for this loop 23826 1726867421.49109: getting the next task for host managed_node2 23826 1726867421.49117: done getting next task for host managed_node2 23826 1726867421.49119: ^ task is: TASK: Enable EPEL 8 23826 1726867421.49124: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867421.49128: getting variables 23826 1726867421.49129: in VariableManager get_vars() 23826 1726867421.49158: Calling all_inventory to load vars for managed_node2 23826 1726867421.49161: Calling groups_inventory to load vars for managed_node2 23826 1726867421.49165: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867421.49179: Calling all_plugins_play to load vars for managed_node2 23826 1726867421.49182: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867421.49185: Calling groups_plugins_play to load vars for managed_node2 23826 1726867421.49896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867421.50242: done with get_vars() 23826 1726867421.50251: done getting variables 23826 1726867421.50310: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 17:23:41 -0400 (0:00:00.032) 0:00:03.514 ****** 23826 1726867421.50338: entering _queue_task() for managed_node2/command 23826 1726867421.50579: worker is 1 (out of 1 available) 23826 1726867421.50705: exiting _queue_task() for managed_node2/command 23826 1726867421.50717: done queuing things up, now waiting for results queue to drain 23826 1726867421.50718: waiting for pending results... 23826 1726867421.50867: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 23826 1726867421.50990: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000000d7 23826 1726867421.51012: variable 'ansible_search_path' from source: unknown 23826 1726867421.51025: variable 'ansible_search_path' from source: unknown 23826 1726867421.51068: calling self._execute() 23826 1726867421.51159: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867421.51163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867421.51246: variable 'omit' from source: magic vars 23826 1726867421.51583: variable 'ansible_distribution' from source: facts 23826 1726867421.51604: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 23826 1726867421.51744: variable 'ansible_distribution_major_version' from source: facts 23826 1726867421.51755: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 23826 1726867421.51764: when evaluation is False, skipping this task 23826 1726867421.51772: _execute() done 23826 1726867421.51782: dumping result to json 23826 1726867421.51795: done dumping result, returning 23826 1726867421.51812: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [0affcac9-a3a5-a92d-a3ea-0000000000d7] 23826 1726867421.51822: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000d7 23826 1726867421.51964: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000d7 23826 1726867421.51967: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 23826 1726867421.52024: no more pending results, returning what we 
have 23826 1726867421.52028: results queue empty 23826 1726867421.52029: checking for any_errors_fatal 23826 1726867421.52033: done checking for any_errors_fatal 23826 1726867421.52034: checking for max_fail_percentage 23826 1726867421.52035: done checking for max_fail_percentage 23826 1726867421.52036: checking to see if all hosts have failed and the running result is not ok 23826 1726867421.52037: done checking to see if all hosts have failed 23826 1726867421.52038: getting the remaining hosts for this loop 23826 1726867421.52040: done getting the remaining hosts for this loop 23826 1726867421.52044: getting the next task for host managed_node2 23826 1726867421.52055: done getting next task for host managed_node2 23826 1726867421.52057: ^ task is: TASK: Enable EPEL 6 23826 1726867421.52061: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867421.52065: getting variables 23826 1726867421.52067: in VariableManager get_vars() 23826 1726867421.52099: Calling all_inventory to load vars for managed_node2 23826 1726867421.52102: Calling groups_inventory to load vars for managed_node2 23826 1726867421.52106: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867421.52123: Calling all_plugins_play to load vars for managed_node2 23826 1726867421.52127: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867421.52130: Calling groups_plugins_play to load vars for managed_node2 23826 1726867421.52468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867421.52667: done with get_vars() 23826 1726867421.52675: done getting variables 23826 1726867421.52740: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 17:23:41 -0400 (0:00:00.024) 0:00:03.539 ****** 23826 1726867421.52767: entering _queue_task() for managed_node2/copy 23826 1726867421.52997: worker is 1 (out of 1 available) 23826 1726867421.53012: exiting _queue_task() for managed_node2/copy 23826 1726867421.53023: done queuing things up, now waiting for results queue to drain 23826 1726867421.53024: waiting for pending results... 
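
The task banners above carry two durations: the time spent on the previous task in parentheses and the cumulative playbook elapsed time, e.g. (0:00:00.024) 0:00:03.539. A short standalone parser for that banner format follows; the format is inferred from this log, and parse_banner is an illustrative helper rather than part of any Ansible timing callback.

# Illustrative parser for the task banner timing, e.g.
# "Friday 20 September 2024 17:23:41 -0400 (0:00:00.024) 0:00:03.539 ******".
import re
from datetime import timedelta

BANNER_RE = re.compile(r"\((?P<delta>\d+:\d{2}:\d{2}\.\d+)\)\s+(?P<elapsed>\d+:\d{2}:\d{2}\.\d+)")

def parse_duration(text):
    hours, minutes, seconds = text.split(":")
    return timedelta(hours=int(hours), minutes=int(minutes), seconds=float(seconds))

def parse_banner(line):
    """Return (previous_task_duration, cumulative_elapsed) as timedeltas."""
    match = BANNER_RE.search(line)
    if match is None:
        raise ValueError("no timing information found in banner line")
    return parse_duration(match.group("delta")), parse_duration(match.group("elapsed"))

delta, elapsed = parse_banner(
    "Friday 20 September 2024 17:23:41 -0400 (0:00:00.024) 0:00:03.539 ******"
)
print(delta.total_seconds(), elapsed.total_seconds())  # 0.024 3.539
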
23826 1726867421.53374: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 23826 1726867421.53381: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000000d9 23826 1726867421.53395: variable 'ansible_search_path' from source: unknown 23826 1726867421.53403: variable 'ansible_search_path' from source: unknown 23826 1726867421.53444: calling self._execute() 23826 1726867421.53529: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867421.53543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867421.53558: variable 'omit' from source: magic vars 23826 1726867421.53946: variable 'ansible_distribution' from source: facts 23826 1726867421.53963: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 23826 1726867421.54089: variable 'ansible_distribution_major_version' from source: facts 23826 1726867421.54126: Evaluated conditional (ansible_distribution_major_version == '6'): False 23826 1726867421.54129: when evaluation is False, skipping this task 23826 1726867421.54132: _execute() done 23826 1726867421.54134: dumping result to json 23826 1726867421.54137: done dumping result, returning 23826 1726867421.54139: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [0affcac9-a3a5-a92d-a3ea-0000000000d9] 23826 1726867421.54235: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000d9 23826 1726867421.54300: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000d9 23826 1726867421.54303: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 23826 1726867421.54352: no more pending results, returning what we have 23826 1726867421.54356: results queue empty 23826 1726867421.54357: checking for any_errors_fatal 23826 1726867421.54361: done checking for any_errors_fatal 23826 1726867421.54362: checking for max_fail_percentage 23826 1726867421.54364: done checking for max_fail_percentage 23826 1726867421.54364: checking to see if all hosts have failed and the running result is not ok 23826 1726867421.54365: done checking to see if all hosts have failed 23826 1726867421.54366: getting the remaining hosts for this loop 23826 1726867421.54367: done getting the remaining hosts for this loop 23826 1726867421.54371: getting the next task for host managed_node2 23826 1726867421.54381: done getting next task for host managed_node2 23826 1726867421.54384: ^ task is: TASK: Set network provider to 'nm' 23826 1726867421.54386: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867421.54390: getting variables 23826 1726867421.54391: in VariableManager get_vars() 23826 1726867421.54420: Calling all_inventory to load vars for managed_node2 23826 1726867421.54423: Calling groups_inventory to load vars for managed_node2 23826 1726867421.54426: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867421.54437: Calling all_plugins_play to load vars for managed_node2 23826 1726867421.54439: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867421.54442: Calling groups_plugins_play to load vars for managed_node2 23826 1726867421.54944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867421.55135: done with get_vars() 23826 1726867421.55143: done getting variables 23826 1726867421.55196: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:13 Friday 20 September 2024 17:23:41 -0400 (0:00:00.024) 0:00:03.563 ****** 23826 1726867421.55222: entering _queue_task() for managed_node2/set_fact 23826 1726867421.55431: worker is 1 (out of 1 available) 23826 1726867421.55441: exiting _queue_task() for managed_node2/set_fact 23826 1726867421.55562: done queuing things up, now waiting for results queue to drain 23826 1726867421.55564: waiting for pending results... 23826 1726867421.55788: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 23826 1726867421.55793: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000007 23826 1726867421.55800: variable 'ansible_search_path' from source: unknown 23826 1726867421.55839: calling self._execute() 23826 1726867421.55926: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867421.55937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867421.55950: variable 'omit' from source: magic vars 23826 1726867421.56101: variable 'omit' from source: magic vars 23826 1726867421.56104: variable 'omit' from source: magic vars 23826 1726867421.56147: variable 'omit' from source: magic vars 23826 1726867421.56194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867421.56247: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867421.56272: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867421.56296: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867421.56341: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867421.56356: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867421.56363: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867421.56370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867421.56535: Set connection var ansible_timeout to 
10 23826 1726867421.56539: Set connection var ansible_shell_executable to /bin/sh 23826 1726867421.56541: Set connection var ansible_connection to ssh 23826 1726867421.56543: Set connection var ansible_pipelining to False 23826 1726867421.56545: Set connection var ansible_shell_type to sh 23826 1726867421.56547: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867421.56556: variable 'ansible_shell_executable' from source: unknown 23826 1726867421.56564: variable 'ansible_connection' from source: unknown 23826 1726867421.56571: variable 'ansible_module_compression' from source: unknown 23826 1726867421.56579: variable 'ansible_shell_type' from source: unknown 23826 1726867421.56587: variable 'ansible_shell_executable' from source: unknown 23826 1726867421.56594: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867421.56600: variable 'ansible_pipelining' from source: unknown 23826 1726867421.56606: variable 'ansible_timeout' from source: unknown 23826 1726867421.56616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867421.56862: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867421.56866: variable 'omit' from source: magic vars 23826 1726867421.56869: starting attempt loop 23826 1726867421.56871: running the handler 23826 1726867421.56873: handler run complete 23826 1726867421.56875: attempt loop complete, returning result 23826 1726867421.56880: _execute() done 23826 1726867421.56882: dumping result to json 23826 1726867421.56884: done dumping result, returning 23826 1726867421.56885: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [0affcac9-a3a5-a92d-a3ea-000000000007] 23826 1726867421.56887: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000007 23826 1726867421.56948: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000007 23826 1726867421.56952: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 23826 1726867421.57012: no more pending results, returning what we have 23826 1726867421.57015: results queue empty 23826 1726867421.57016: checking for any_errors_fatal 23826 1726867421.57022: done checking for any_errors_fatal 23826 1726867421.57022: checking for max_fail_percentage 23826 1726867421.57024: done checking for max_fail_percentage 23826 1726867421.57025: checking to see if all hosts have failed and the running result is not ok 23826 1726867421.57026: done checking to see if all hosts have failed 23826 1726867421.57027: getting the remaining hosts for this loop 23826 1726867421.57028: done getting the remaining hosts for this loop 23826 1726867421.57033: getting the next task for host managed_node2 23826 1726867421.57039: done getting next task for host managed_node2 23826 1726867421.57041: ^ task is: TASK: meta (flush_handlers) 23826 1726867421.57043: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867421.57047: getting variables 23826 1726867421.57049: in VariableManager get_vars() 23826 1726867421.57189: Calling all_inventory to load vars for managed_node2 23826 1726867421.57191: Calling groups_inventory to load vars for managed_node2 23826 1726867421.57195: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867421.57204: Calling all_plugins_play to load vars for managed_node2 23826 1726867421.57210: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867421.57213: Calling groups_plugins_play to load vars for managed_node2 23826 1726867421.57480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867421.57679: done with get_vars() 23826 1726867421.57688: done getting variables 23826 1726867421.57757: in VariableManager get_vars() 23826 1726867421.57766: Calling all_inventory to load vars for managed_node2 23826 1726867421.57768: Calling groups_inventory to load vars for managed_node2 23826 1726867421.57770: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867421.57774: Calling all_plugins_play to load vars for managed_node2 23826 1726867421.57778: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867421.57781: Calling groups_plugins_play to load vars for managed_node2 23826 1726867421.57955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867421.58258: done with get_vars() 23826 1726867421.58483: done queuing things up, now waiting for results queue to drain 23826 1726867421.58485: results queue empty 23826 1726867421.58680: checking for any_errors_fatal 23826 1726867421.58682: done checking for any_errors_fatal 23826 1726867421.58683: checking for max_fail_percentage 23826 1726867421.58684: done checking for max_fail_percentage 23826 1726867421.58685: checking to see if all hosts have failed and the running result is not ok 23826 1726867421.58686: done checking to see if all hosts have failed 23826 1726867421.58686: getting the remaining hosts for this loop 23826 1726867421.58687: done getting the remaining hosts for this loop 23826 1726867421.58690: getting the next task for host managed_node2 23826 1726867421.58695: done getting next task for host managed_node2 23826 1726867421.58696: ^ task is: TASK: meta (flush_handlers) 23826 1726867421.58697: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867421.58705: getting variables 23826 1726867421.58706: in VariableManager get_vars() 23826 1726867421.58716: Calling all_inventory to load vars for managed_node2 23826 1726867421.58718: Calling groups_inventory to load vars for managed_node2 23826 1726867421.58720: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867421.58724: Calling all_plugins_play to load vars for managed_node2 23826 1726867421.58727: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867421.58729: Calling groups_plugins_play to load vars for managed_node2 23826 1726867421.58866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867421.59526: done with get_vars() 23826 1726867421.59534: done getting variables 23826 1726867421.59779: in VariableManager get_vars() 23826 1726867421.59788: Calling all_inventory to load vars for managed_node2 23826 1726867421.59791: Calling groups_inventory to load vars for managed_node2 23826 1726867421.59793: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867421.59797: Calling all_plugins_play to load vars for managed_node2 23826 1726867421.59799: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867421.59802: Calling groups_plugins_play to load vars for managed_node2 23826 1726867421.59920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867421.60322: done with get_vars() 23826 1726867421.60335: done queuing things up, now waiting for results queue to drain 23826 1726867421.60337: results queue empty 23826 1726867421.60337: checking for any_errors_fatal 23826 1726867421.60339: done checking for any_errors_fatal 23826 1726867421.60339: checking for max_fail_percentage 23826 1726867421.60340: done checking for max_fail_percentage 23826 1726867421.60341: checking to see if all hosts have failed and the running result is not ok 23826 1726867421.60342: done checking to see if all hosts have failed 23826 1726867421.60343: getting the remaining hosts for this loop 23826 1726867421.60343: done getting the remaining hosts for this loop 23826 1726867421.60346: getting the next task for host managed_node2 23826 1726867421.60349: done getting next task for host managed_node2 23826 1726867421.60350: ^ task is: None 23826 1726867421.60352: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867421.60353: done queuing things up, now waiting for results queue to drain 23826 1726867421.60354: results queue empty 23826 1726867421.60354: checking for any_errors_fatal 23826 1726867421.60355: done checking for any_errors_fatal 23826 1726867421.60356: checking for max_fail_percentage 23826 1726867421.60357: done checking for max_fail_percentage 23826 1726867421.60357: checking to see if all hosts have failed and the running result is not ok 23826 1726867421.60358: done checking to see if all hosts have failed 23826 1726867421.60360: getting the next task for host managed_node2 23826 1726867421.60362: done getting next task for host managed_node2 23826 1726867421.60363: ^ task is: None 23826 1726867421.60364: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867421.60586: in VariableManager get_vars() 23826 1726867421.60610: done with get_vars() 23826 1726867421.60618: in VariableManager get_vars() 23826 1726867421.60630: done with get_vars() 23826 1726867421.60635: variable 'omit' from source: magic vars 23826 1726867421.60670: in VariableManager get_vars() 23826 1726867421.60685: done with get_vars() 23826 1726867421.60712: variable 'omit' from source: magic vars PLAY [Play for testing ipv6 disabled] ****************************************** 23826 1726867421.61195: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 23826 1726867421.61285: getting the remaining hosts for this loop 23826 1726867421.61286: done getting the remaining hosts for this loop 23826 1726867421.61289: getting the next task for host managed_node2 23826 1726867421.61292: done getting next task for host managed_node2 23826 1726867421.61294: ^ task is: TASK: Gathering Facts 23826 1726867421.61295: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867421.61297: getting variables 23826 1726867421.61298: in VariableManager get_vars() 23826 1726867421.61311: Calling all_inventory to load vars for managed_node2 23826 1726867421.61314: Calling groups_inventory to load vars for managed_node2 23826 1726867421.61316: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867421.61320: Calling all_plugins_play to load vars for managed_node2 23826 1726867421.61333: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867421.61452: Calling groups_plugins_play to load vars for managed_node2 23826 1726867421.61711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867421.62114: done with get_vars() 23826 1726867421.62123: done getting variables 23826 1726867421.62163: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:3 Friday 20 September 2024 17:23:41 -0400 (0:00:00.069) 0:00:03.633 ****** 23826 1726867421.62188: entering _queue_task() for managed_node2/gather_facts 23826 1726867421.62843: worker is 1 (out of 1 available) 23826 1726867421.62851: exiting _queue_task() for managed_node2/gather_facts 23826 1726867421.62860: done queuing things up, now waiting for results queue to drain 23826 1726867421.62861: waiting for pending results... 
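
Before a module runs on the host, the worker resolves the connection settings for that task, as the "Set connection var ..." lines above show: values defined in host vars take effect where present, and otherwise the values printed in this run are used (ansible_connection ssh, ansible_timeout 10, shell /bin/sh, pipelining off, module compression ZIP_DEFLATED). The sketch below only mirrors that overlay for illustration; the defaults are copied from this log output, and resolve_connection_vars is not Ansible code.

# Illustrative only: overlay host vars on the connection values reported in
# the "Set connection var ..." lines of this particular run.
DEFAULTS = {
    "ansible_connection": "ssh",
    "ansible_shell_type": "sh",
    "ansible_shell_executable": "/bin/sh",
    "ansible_timeout": 10,
    "ansible_pipelining": False,
    "ansible_module_compression": "ZIP_DEFLATED",
}

def resolve_connection_vars(host_vars):
    """Return the effective connection settings for one task on one host."""
    resolved = dict(DEFAULTS)
    resolved.update({key: value for key, value in host_vars.items() if key in DEFAULTS})
    return resolved

print(resolve_connection_vars({"ansible_host": "10.31.12.116", "ansible_timeout": 10}))
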
23826 1726867421.63095: running TaskExecutor() for managed_node2/TASK: Gathering Facts 23826 1726867421.63100: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000000ff 23826 1726867421.63124: variable 'ansible_search_path' from source: unknown 23826 1726867421.63156: calling self._execute() 23826 1726867421.63244: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867421.63259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867421.63273: variable 'omit' from source: magic vars 23826 1726867421.63649: variable 'ansible_distribution_major_version' from source: facts 23826 1726867421.63674: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867421.63687: variable 'omit' from source: magic vars 23826 1726867421.63719: variable 'omit' from source: magic vars 23826 1726867421.63757: variable 'omit' from source: magic vars 23826 1726867421.63811: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867421.63852: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867421.63886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867421.63982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867421.63990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867421.63994: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867421.63996: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867421.63998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867421.64101: Set connection var ansible_timeout to 10 23826 1726867421.64105: Set connection var ansible_shell_executable to /bin/sh 23826 1726867421.64115: Set connection var ansible_connection to ssh 23826 1726867421.64127: Set connection var ansible_pipelining to False 23826 1726867421.64181: Set connection var ansible_shell_type to sh 23826 1726867421.64184: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867421.64186: variable 'ansible_shell_executable' from source: unknown 23826 1726867421.64188: variable 'ansible_connection' from source: unknown 23826 1726867421.64190: variable 'ansible_module_compression' from source: unknown 23826 1726867421.64192: variable 'ansible_shell_type' from source: unknown 23826 1726867421.64193: variable 'ansible_shell_executable' from source: unknown 23826 1726867421.64195: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867421.64197: variable 'ansible_pipelining' from source: unknown 23826 1726867421.64198: variable 'ansible_timeout' from source: unknown 23826 1726867421.64200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867421.64379: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867421.64395: variable 'omit' from source: magic vars 23826 1726867421.64405: starting attempt loop 23826 1726867421.64414: running the 
handler 23826 1726867421.64442: variable 'ansible_facts' from source: unknown 23826 1726867421.64535: _low_level_execute_command(): starting 23826 1726867421.64538: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867421.65310: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867421.65333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867421.65356: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867421.65398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867421.65448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867421.67800: stdout chunk (state=3): >>>/root <<< 23826 1726867421.67995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867421.68073: stderr chunk (state=3): >>><<< 23826 1726867421.68283: stdout chunk (state=3): >>><<< 23826 1726867421.68288: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 23826 1726867421.68291: _low_level_execute_command(): starting 23826 1726867421.68294: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867421.6820943-24054-280790674902162 `" && echo ansible-tmp-1726867421.6820943-24054-280790674902162="` echo 
/root/.ansible/tmp/ansible-tmp-1726867421.6820943-24054-280790674902162 `" ) && sleep 0' 23826 1726867421.69269: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867421.69368: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867421.69399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867421.69487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867421.72219: stdout chunk (state=3): >>>ansible-tmp-1726867421.6820943-24054-280790674902162=/root/.ansible/tmp/ansible-tmp-1726867421.6820943-24054-280790674902162 <<< 23826 1726867421.72404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867421.72411: stdout chunk (state=3): >>><<< 23826 1726867421.72415: stderr chunk (state=3): >>><<< 23826 1726867421.72624: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867421.6820943-24054-280790674902162=/root/.ansible/tmp/ansible-tmp-1726867421.6820943-24054-280790674902162 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 23826 1726867421.72627: variable 'ansible_module_compression' from source: unknown 23826 1726867421.72784: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 23826 1726867421.72801: variable 'ansible_facts' from source: unknown 23826 1726867421.73266: transferring module 
to remote /root/.ansible/tmp/ansible-tmp-1726867421.6820943-24054-280790674902162/AnsiballZ_setup.py 23826 1726867421.73416: Sending initial data 23826 1726867421.73418: Sent initial data (154 bytes) 23826 1726867421.74146: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867421.74203: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867421.74231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867421.74256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867421.74385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867421.76773: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867421.76883: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867421.76979: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpl5y36xit /root/.ansible/tmp/ansible-tmp-1726867421.6820943-24054-280790674902162/AnsiballZ_setup.py <<< 23826 1726867421.76983: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867421.6820943-24054-280790674902162/AnsiballZ_setup.py" <<< 23826 1726867421.77021: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpl5y36xit" to remote "/root/.ansible/tmp/ansible-tmp-1726867421.6820943-24054-280790674902162/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867421.6820943-24054-280790674902162/AnsiballZ_setup.py" <<< 23826 1726867421.80213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867421.80216: stderr chunk (state=3): >>><<< 23826 1726867421.80219: stdout chunk (state=3): >>><<< 23826 1726867421.80221: done transferring module to remote 23826 1726867421.80223: _low_level_execute_command(): starting 23826 1726867421.80226: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867421.6820943-24054-280790674902162/ /root/.ansible/tmp/ansible-tmp-1726867421.6820943-24054-280790674902162/AnsiballZ_setup.py && sleep 0' 23826 1726867421.81453: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867421.81466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 23826 1726867421.81770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867421.81836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867421.81901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867421.84496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867421.84526: stderr chunk (state=3): >>><<< 23826 1726867421.84552: stdout chunk (state=3): >>><<< 23826 1726867421.84593: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 23826 1726867421.84686: _low_level_execute_command(): starting 23826 1726867421.84690: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867421.6820943-24054-280790674902162/AnsiballZ_setup.py && sleep 0' 23826 1726867421.86092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867421.86309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867421.86494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867421.86566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867422.69670: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_loadavg": {"1m": 0.43603515625, "5m": 0.37548828125, "15m": 0.21337890625}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "<<< 23826 1726867422.69716: stdout chunk (state=3): >>>ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2929, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 602, "free": 2929}, "nocache": {"free": 3268, "used": 263}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", 
"ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 660, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794111488, "block_size": 4096, "block_total": 65519099, "block_available": 63914578, "block_used": 1604521, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", 
"tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": <<< 23826 1726867422.69755: stdout chunk (state=3): >>>"ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "23", "second": "42", "epoch": "1726867422", "epoch_int": "1726867422", "date": "2024-09-20", "time": "17:23:42", "iso8601_micro": "2024-09-20T21:23:42.691243Z", "iso8601": "2024-09-20T21:23:42Z", "iso8601_basic": "20240920T172342691243", "iso8601_basic_short": "20240920T172342", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 23826 1726867422.72561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 23826 1726867422.72571: stdout chunk (state=3): >>><<< 23826 1726867422.72588: stderr chunk (state=3): >>><<< 23826 1726867422.72638: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_loadavg": {"1m": 0.43603515625, "5m": 0.37548828125, "15m": 0.21337890625}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2929, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 602, "free": 2929}, "nocache": {"free": 3268, "used": 263}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 660, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794111488, "block_size": 4096, "block_total": 65519099, "block_available": 63914578, "block_used": 1604521, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", 
"pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", 
"ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "23", "second": "42", "epoch": "1726867422", "epoch_int": "1726867422", "date": "2024-09-20", "time": "17:23:42", "iso8601_micro": "2024-09-20T21:23:42.691243Z", "iso8601": "2024-09-20T21:23:42Z", "iso8601_basic": "20240920T172342691243", "iso8601_basic_short": "20240920T172342", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], 
"module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 23826 1726867422.72984: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867421.6820943-24054-280790674902162/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867422.73013: _low_level_execute_command(): starting 23826 1726867422.73024: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867421.6820943-24054-280790674902162/ > /dev/null 2>&1 && sleep 0' 23826 1726867422.73662: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867422.73684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867422.73763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867422.73801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867422.73825: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867422.73841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867422.73926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867422.76630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867422.76634: stdout chunk (state=3): >>><<< 23826 1726867422.76647: stderr chunk (state=3): >>><<< 23826 1726867422.76884: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 23826 1726867422.76888: handler run complete 23826 1726867422.76890: variable 'ansible_facts' from source: unknown 23826 1726867422.76893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867422.77231: variable 'ansible_facts' from source: unknown 23826 1726867422.77323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867422.77491: attempt loop complete, returning result 23826 1726867422.77555: _execute() done 23826 1726867422.77558: dumping result to json 23826 1726867422.77560: done dumping result, returning 23826 1726867422.77562: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-a92d-a3ea-0000000000ff] 23826 1726867422.77569: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000ff ok: [managed_node2] 23826 1726867422.78622: no more pending results, returning what we have 23826 1726867422.78625: results queue empty 23826 1726867422.78626: checking for any_errors_fatal 23826 1726867422.78627: done checking for any_errors_fatal 23826 1726867422.78628: checking for max_fail_percentage 23826 1726867422.78634: done checking for max_fail_percentage 23826 1726867422.78635: checking to see if all hosts have failed and the running result is not ok 23826 1726867422.78636: done checking to see if all hosts have failed 23826 1726867422.78637: getting the remaining hosts for this loop 23826 1726867422.78638: done getting the remaining hosts for this loop 23826 1726867422.78641: getting the next task for host managed_node2 23826 1726867422.78646: done getting next task for host managed_node2 23826 1726867422.78648: ^ task is: TASK: meta 
(flush_handlers) 23826 1726867422.78650: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867422.78653: getting variables 23826 1726867422.78654: in VariableManager get_vars() 23826 1726867422.78683: Calling all_inventory to load vars for managed_node2 23826 1726867422.78686: Calling groups_inventory to load vars for managed_node2 23826 1726867422.78688: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867422.78693: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000000ff 23826 1726867422.78696: WORKER PROCESS EXITING 23826 1726867422.78704: Calling all_plugins_play to load vars for managed_node2 23826 1726867422.78709: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867422.78712: Calling groups_plugins_play to load vars for managed_node2 23826 1726867422.78886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867422.79091: done with get_vars() 23826 1726867422.79100: done getting variables 23826 1726867422.79161: in VariableManager get_vars() 23826 1726867422.79175: Calling all_inventory to load vars for managed_node2 23826 1726867422.79178: Calling groups_inventory to load vars for managed_node2 23826 1726867422.79181: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867422.79185: Calling all_plugins_play to load vars for managed_node2 23826 1726867422.79187: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867422.79189: Calling groups_plugins_play to load vars for managed_node2 23826 1726867422.79326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867422.79529: done with get_vars() 23826 1726867422.79541: done queuing things up, now waiting for results queue to drain 23826 1726867422.79543: results queue empty 23826 1726867422.79544: checking for any_errors_fatal 23826 1726867422.79547: done checking for any_errors_fatal 23826 1726867422.79548: checking for max_fail_percentage 23826 1726867422.79549: done checking for max_fail_percentage 23826 1726867422.79553: checking to see if all hosts have failed and the running result is not ok 23826 1726867422.79554: done checking to see if all hosts have failed 23826 1726867422.79554: getting the remaining hosts for this loop 23826 1726867422.79555: done getting the remaining hosts for this loop 23826 1726867422.79558: getting the next task for host managed_node2 23826 1726867422.79561: done getting next task for host managed_node2 23826 1726867422.79563: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 23826 1726867422.79564: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867422.79567: getting variables 23826 1726867422.79567: in VariableManager get_vars() 23826 1726867422.79581: Calling all_inventory to load vars for managed_node2 23826 1726867422.79583: Calling groups_inventory to load vars for managed_node2 23826 1726867422.79585: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867422.79589: Calling all_plugins_play to load vars for managed_node2 23826 1726867422.79592: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867422.79594: Calling groups_plugins_play to load vars for managed_node2 23826 1726867422.79746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867422.79971: done with get_vars() 23826 1726867422.79981: done getting variables 23826 1726867422.80023: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 23826 1726867422.80180: variable 'type' from source: play vars 23826 1726867422.80186: variable 'interface' from source: play vars TASK [Set type=veth and interface=ethtest0] ************************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:10 Friday 20 September 2024 17:23:42 -0400 (0:00:01.180) 0:00:04.813 ****** 23826 1726867422.80225: entering _queue_task() for managed_node2/set_fact 23826 1726867422.80501: worker is 1 (out of 1 available) 23826 1726867422.80517: exiting _queue_task() for managed_node2/set_fact 23826 1726867422.80528: done queuing things up, now waiting for results queue to drain 23826 1726867422.80530: waiting for pending results... 
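The playbook file itself is not reproduced in this log, only its execution. The trace that follows shows the task resolving through the set_fact action plugin, pulling both 'type' and 'interface' from play vars, and returning them as facts (type: veth, interface: ethtest0). Under those assumptions, a minimal sketch of what the task at tests_ipv6_disabled.yml:10 presumably looks like (a reconstruction, not the actual file contents):

- name: "Set type={{ type }} and interface={{ interface }}"
  set_fact:
    type: "{{ type }}"              # play var; renders to "veth" in this run
    interface: "{{ interface }}"    # play var; renders to "ethtest0" in this run

The rendered values are returned under ansible_facts in the ok: [managed_node2] result recorded a few lines below.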
23826 1726867422.81093: running TaskExecutor() for managed_node2/TASK: Set type=veth and interface=ethtest0 23826 1726867422.81098: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000000b 23826 1726867422.81101: variable 'ansible_search_path' from source: unknown 23826 1726867422.81104: calling self._execute() 23826 1726867422.81108: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867422.81120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867422.81133: variable 'omit' from source: magic vars 23826 1726867422.81469: variable 'ansible_distribution_major_version' from source: facts 23826 1726867422.81486: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867422.81496: variable 'omit' from source: magic vars 23826 1726867422.81517: variable 'omit' from source: magic vars 23826 1726867422.81550: variable 'type' from source: play vars 23826 1726867422.81628: variable 'type' from source: play vars 23826 1726867422.81645: variable 'interface' from source: play vars 23826 1726867422.81713: variable 'interface' from source: play vars 23826 1726867422.81735: variable 'omit' from source: magic vars 23826 1726867422.81783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867422.81826: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867422.81852: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867422.81875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867422.81896: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867422.81931: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867422.81940: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867422.81947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867422.82048: Set connection var ansible_timeout to 10 23826 1726867422.82065: Set connection var ansible_shell_executable to /bin/sh 23826 1726867422.82073: Set connection var ansible_connection to ssh 23826 1726867422.82088: Set connection var ansible_pipelining to False 23826 1726867422.82095: Set connection var ansible_shell_type to sh 23826 1726867422.82105: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867422.82132: variable 'ansible_shell_executable' from source: unknown 23826 1726867422.82142: variable 'ansible_connection' from source: unknown 23826 1726867422.82150: variable 'ansible_module_compression' from source: unknown 23826 1726867422.82157: variable 'ansible_shell_type' from source: unknown 23826 1726867422.82164: variable 'ansible_shell_executable' from source: unknown 23826 1726867422.82171: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867422.82182: variable 'ansible_pipelining' from source: unknown 23826 1726867422.82190: variable 'ansible_timeout' from source: unknown 23826 1726867422.82198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867422.82338: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867422.82360: variable 'omit' from source: magic vars 23826 1726867422.82372: starting attempt loop 23826 1726867422.82470: running the handler 23826 1726867422.82473: handler run complete 23826 1726867422.82476: attempt loop complete, returning result 23826 1726867422.82479: _execute() done 23826 1726867422.82481: dumping result to json 23826 1726867422.82483: done dumping result, returning 23826 1726867422.82485: done running TaskExecutor() for managed_node2/TASK: Set type=veth and interface=ethtest0 [0affcac9-a3a5-a92d-a3ea-00000000000b] 23826 1726867422.82488: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000000b 23826 1726867422.82549: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000000b 23826 1726867422.82553: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 23826 1726867422.82626: no more pending results, returning what we have 23826 1726867422.82629: results queue empty 23826 1726867422.82630: checking for any_errors_fatal 23826 1726867422.82633: done checking for any_errors_fatal 23826 1726867422.82633: checking for max_fail_percentage 23826 1726867422.82635: done checking for max_fail_percentage 23826 1726867422.82636: checking to see if all hosts have failed and the running result is not ok 23826 1726867422.82637: done checking to see if all hosts have failed 23826 1726867422.82638: getting the remaining hosts for this loop 23826 1726867422.82640: done getting the remaining hosts for this loop 23826 1726867422.82643: getting the next task for host managed_node2 23826 1726867422.82650: done getting next task for host managed_node2 23826 1726867422.82652: ^ task is: TASK: Include the task 'show_interfaces.yml' 23826 1726867422.82654: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867422.82658: getting variables 23826 1726867422.82659: in VariableManager get_vars() 23826 1726867422.82696: Calling all_inventory to load vars for managed_node2 23826 1726867422.82700: Calling groups_inventory to load vars for managed_node2 23826 1726867422.82702: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867422.82712: Calling all_plugins_play to load vars for managed_node2 23826 1726867422.82715: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867422.82719: Calling groups_plugins_play to load vars for managed_node2 23826 1726867422.83070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867422.83258: done with get_vars() 23826 1726867422.83267: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:14 Friday 20 September 2024 17:23:42 -0400 (0:00:00.031) 0:00:04.844 ****** 23826 1726867422.83349: entering _queue_task() for managed_node2/include_tasks 23826 1726867422.83556: worker is 1 (out of 1 available) 23826 1726867422.83567: exiting _queue_task() for managed_node2/include_tasks 23826 1726867422.83581: done queuing things up, now waiting for results queue to drain 23826 1726867422.83582: waiting for pending results... 23826 1726867422.83900: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 23826 1726867422.83911: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000000c 23826 1726867422.83929: variable 'ansible_search_path' from source: unknown 23826 1726867422.83965: calling self._execute() 23826 1726867422.84050: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867422.84063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867422.84075: variable 'omit' from source: magic vars 23826 1726867422.84426: variable 'ansible_distribution_major_version' from source: facts 23826 1726867422.84448: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867422.84461: _execute() done 23826 1726867422.84470: dumping result to json 23826 1726867422.84482: done dumping result, returning 23826 1726867422.84538: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0affcac9-a3a5-a92d-a3ea-00000000000c] 23826 1726867422.84541: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000000c 23826 1726867422.84614: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000000c 23826 1726867422.84618: WORKER PROCESS EXITING 23826 1726867422.84645: no more pending results, returning what we have 23826 1726867422.84650: in VariableManager get_vars() 23826 1726867422.84688: Calling all_inventory to load vars for managed_node2 23826 1726867422.84691: Calling groups_inventory to load vars for managed_node2 23826 1726867422.84693: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867422.84704: Calling all_plugins_play to load vars for managed_node2 23826 1726867422.84707: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867422.84709: Calling groups_plugins_play to load vars for managed_node2 23826 1726867422.84972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867422.85180: done with get_vars() 23826 
1726867422.85194: variable 'ansible_search_path' from source: unknown 23826 1726867422.85211: we have included files to process 23826 1726867422.85213: generating all_blocks data 23826 1726867422.85214: done generating all_blocks data 23826 1726867422.85215: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 23826 1726867422.85216: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 23826 1726867422.85219: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 23826 1726867422.85373: in VariableManager get_vars() 23826 1726867422.85393: done with get_vars() 23826 1726867422.85512: done processing included file 23826 1726867422.85520: iterating over new_blocks loaded from include file 23826 1726867422.85522: in VariableManager get_vars() 23826 1726867422.85541: done with get_vars() 23826 1726867422.85543: filtering new block on tags 23826 1726867422.85559: done filtering new block on tags 23826 1726867422.85561: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 23826 1726867422.85565: extending task lists for all hosts with included blocks 23826 1726867422.86672: done extending task lists 23826 1726867422.86673: done processing included files 23826 1726867422.86674: results queue empty 23826 1726867422.86675: checking for any_errors_fatal 23826 1726867422.86679: done checking for any_errors_fatal 23826 1726867422.86680: checking for max_fail_percentage 23826 1726867422.86681: done checking for max_fail_percentage 23826 1726867422.86681: checking to see if all hosts have failed and the running result is not ok 23826 1726867422.86682: done checking to see if all hosts have failed 23826 1726867422.86683: getting the remaining hosts for this loop 23826 1726867422.86684: done getting the remaining hosts for this loop 23826 1726867422.86687: getting the next task for host managed_node2 23826 1726867422.86690: done getting next task for host managed_node2 23826 1726867422.86692: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 23826 1726867422.86694: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867422.86697: getting variables 23826 1726867422.86698: in VariableManager get_vars() 23826 1726867422.86708: Calling all_inventory to load vars for managed_node2 23826 1726867422.86710: Calling groups_inventory to load vars for managed_node2 23826 1726867422.86712: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867422.86717: Calling all_plugins_play to load vars for managed_node2 23826 1726867422.86719: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867422.86722: Calling groups_plugins_play to load vars for managed_node2 23826 1726867422.86910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867422.87103: done with get_vars() 23826 1726867422.87120: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 17:23:42 -0400 (0:00:00.038) 0:00:04.883 ****** 23826 1726867422.87192: entering _queue_task() for managed_node2/include_tasks 23826 1726867422.87442: worker is 1 (out of 1 available) 23826 1726867422.87459: exiting _queue_task() for managed_node2/include_tasks 23826 1726867422.87470: done queuing things up, now waiting for results queue to drain 23826 1726867422.87471: waiting for pending results... 23826 1726867422.87896: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 23826 1726867422.87902: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000115 23826 1726867422.87905: variable 'ansible_search_path' from source: unknown 23826 1726867422.87911: variable 'ansible_search_path' from source: unknown 23826 1726867422.87914: calling self._execute() 23826 1726867422.87956: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867422.87962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867422.87972: variable 'omit' from source: magic vars 23826 1726867422.88351: variable 'ansible_distribution_major_version' from source: facts 23826 1726867422.88361: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867422.88367: _execute() done 23826 1726867422.88370: dumping result to json 23826 1726867422.88373: done dumping result, returning 23826 1726867422.88382: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0affcac9-a3a5-a92d-a3ea-000000000115] 23826 1726867422.88387: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000115 23826 1726867422.88470: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000115 23826 1726867422.88474: WORKER PROCESS EXITING 23826 1726867422.88503: no more pending results, returning what we have 23826 1726867422.88508: in VariableManager get_vars() 23826 1726867422.88680: Calling all_inventory to load vars for managed_node2 23826 1726867422.88683: Calling groups_inventory to load vars for managed_node2 23826 1726867422.88685: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867422.88693: Calling all_plugins_play to load vars for managed_node2 23826 1726867422.88696: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867422.88699: Calling groups_plugins_play to load vars for managed_node2 23826 1726867422.88866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 23826 1726867422.89087: done with get_vars() 23826 1726867422.89094: variable 'ansible_search_path' from source: unknown 23826 1726867422.89095: variable 'ansible_search_path' from source: unknown 23826 1726867422.89135: we have included files to process 23826 1726867422.89136: generating all_blocks data 23826 1726867422.89138: done generating all_blocks data 23826 1726867422.89139: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 23826 1726867422.89140: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 23826 1726867422.89142: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 23826 1726867422.89423: done processing included file 23826 1726867422.89424: iterating over new_blocks loaded from include file 23826 1726867422.89426: in VariableManager get_vars() 23826 1726867422.89447: done with get_vars() 23826 1726867422.89449: filtering new block on tags 23826 1726867422.89463: done filtering new block on tags 23826 1726867422.89464: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 23826 1726867422.89468: extending task lists for all hosts with included blocks 23826 1726867422.89567: done extending task lists 23826 1726867422.89568: done processing included files 23826 1726867422.89569: results queue empty 23826 1726867422.89570: checking for any_errors_fatal 23826 1726867422.89572: done checking for any_errors_fatal 23826 1726867422.89573: checking for max_fail_percentage 23826 1726867422.89574: done checking for max_fail_percentage 23826 1726867422.89575: checking to see if all hosts have failed and the running result is not ok 23826 1726867422.89575: done checking to see if all hosts have failed 23826 1726867422.89576: getting the remaining hosts for this loop 23826 1726867422.89579: done getting the remaining hosts for this loop 23826 1726867422.89582: getting the next task for host managed_node2 23826 1726867422.89586: done getting next task for host managed_node2 23826 1726867422.89588: ^ task is: TASK: Gather current interface info 23826 1726867422.89591: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867422.89594: getting variables 23826 1726867422.89595: in VariableManager get_vars() 23826 1726867422.89605: Calling all_inventory to load vars for managed_node2 23826 1726867422.89608: Calling groups_inventory to load vars for managed_node2 23826 1726867422.89610: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867422.89614: Calling all_plugins_play to load vars for managed_node2 23826 1726867422.89617: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867422.89620: Calling groups_plugins_play to load vars for managed_node2 23826 1726867422.89772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867422.89964: done with get_vars() 23826 1726867422.89972: done getting variables 23826 1726867422.90022: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 17:23:42 -0400 (0:00:00.028) 0:00:04.911 ****** 23826 1726867422.90049: entering _queue_task() for managed_node2/command 23826 1726867422.90276: worker is 1 (out of 1 available) 23826 1726867422.90288: exiting _queue_task() for managed_node2/command 23826 1726867422.90298: done queuing things up, now waiting for results queue to drain 23826 1726867422.90299: waiting for pending results... 
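The body of get_current_interfaces.yml is likewise not included here; the log only records that the task at line 3 is named 'Gather current interface info' and is dispatched through the command action plugin. A plausible sketch of that task, assuming the common pattern of listing /sys/class/net (the actual command line and the register variable name are assumptions, not taken from this log):

- name: Gather current interface info
  command: ls -1 /sys/class/net     # assumed probe; the real arguments are not visible in this excerpt
  register: current_interfaces      # hypothetical variable name
  changed_when: false               # a read-only listing should not report a change

The connection setup for this task is visible in the trace that follows: connection 'ssh', shell 'sh', pipelining False, module compression ZIP_DEFLATED.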
23826 1726867422.90694: running TaskExecutor() for managed_node2/TASK: Gather current interface info 23826 1726867422.90699: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000192 23826 1726867422.90702: variable 'ansible_search_path' from source: unknown 23826 1726867422.90704: variable 'ansible_search_path' from source: unknown 23826 1726867422.90706: calling self._execute() 23826 1726867422.90739: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867422.90750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867422.90763: variable 'omit' from source: magic vars 23826 1726867422.91159: variable 'ansible_distribution_major_version' from source: facts 23826 1726867422.91174: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867422.91186: variable 'omit' from source: magic vars 23826 1726867422.91236: variable 'omit' from source: magic vars 23826 1726867422.91275: variable 'omit' from source: magic vars 23826 1726867422.91321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867422.91361: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867422.91385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867422.91409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867422.91425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867422.91455: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867422.91463: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867422.91470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867422.91571: Set connection var ansible_timeout to 10 23826 1726867422.91589: Set connection var ansible_shell_executable to /bin/sh 23826 1726867422.91597: Set connection var ansible_connection to ssh 23826 1726867422.91612: Set connection var ansible_pipelining to False 23826 1726867422.91621: Set connection var ansible_shell_type to sh 23826 1726867422.91632: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867422.91659: variable 'ansible_shell_executable' from source: unknown 23826 1726867422.91667: variable 'ansible_connection' from source: unknown 23826 1726867422.91675: variable 'ansible_module_compression' from source: unknown 23826 1726867422.91850: variable 'ansible_shell_type' from source: unknown 23826 1726867422.91854: variable 'ansible_shell_executable' from source: unknown 23826 1726867422.91856: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867422.91858: variable 'ansible_pipelining' from source: unknown 23826 1726867422.91860: variable 'ansible_timeout' from source: unknown 23826 1726867422.91862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867422.91865: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867422.91867: variable 'omit' from source: magic vars 23826 
1726867422.91869: starting attempt loop 23826 1726867422.91872: running the handler 23826 1726867422.91931: _low_level_execute_command(): starting 23826 1726867422.91939: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867422.92714: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867422.92736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867422.92792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867422.92855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867422.92867: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867422.92889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867422.92972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867422.95336: stdout chunk (state=3): >>>/root <<< 23826 1726867422.95493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867422.95513: stderr chunk (state=3): >>><<< 23826 1726867422.95524: stdout chunk (state=3): >>><<< 23826 1726867422.95560: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 23826 1726867422.95664: _low_level_execute_command(): starting 23826 1726867422.95668: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867422.9556816-24122-220727239697016 `" && echo ansible-tmp-1726867422.9556816-24122-220727239697016="` echo /root/.ansible/tmp/ansible-tmp-1726867422.9556816-24122-220727239697016 `" ) && sleep 0' 23826 1726867422.96265: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867422.96290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867422.96350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867422.96425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867422.96450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867422.96483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867422.96563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867422.99422: stdout chunk (state=3): >>>ansible-tmp-1726867422.9556816-24122-220727239697016=/root/.ansible/tmp/ansible-tmp-1726867422.9556816-24122-220727239697016 <<< 23826 1726867422.99534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867422.99584: stderr chunk (state=3): >>><<< 23826 1726867422.99587: stdout chunk (state=3): >>><<< 23826 1726867422.99590: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867422.9556816-24122-220727239697016=/root/.ansible/tmp/ansible-tmp-1726867422.9556816-24122-220727239697016 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 23826 1726867422.99616: variable 
'ansible_module_compression' from source: unknown 23826 1726867422.99791: ANSIBALLZ: Using generic lock for ansible.legacy.command 23826 1726867422.99795: ANSIBALLZ: Acquiring lock 23826 1726867422.99797: ANSIBALLZ: Lock acquired: 139851310993328 23826 1726867422.99799: ANSIBALLZ: Creating module 23826 1726867423.11289: ANSIBALLZ: Writing module into payload 23826 1726867423.11390: ANSIBALLZ: Writing module 23826 1726867423.11418: ANSIBALLZ: Renaming module 23826 1726867423.11431: ANSIBALLZ: Done creating module 23826 1726867423.11453: variable 'ansible_facts' from source: unknown 23826 1726867423.11537: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867422.9556816-24122-220727239697016/AnsiballZ_command.py 23826 1726867423.11709: Sending initial data 23826 1726867423.11713: Sent initial data (156 bytes) 23826 1726867423.12394: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867423.12447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867423.12472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867423.12493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867423.12573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867423.15018: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867423.15059: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867423.15122: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpfq2vudnb /root/.ansible/tmp/ansible-tmp-1726867422.9556816-24122-220727239697016/AnsiballZ_command.py <<< 23826 1726867423.15125: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867422.9556816-24122-220727239697016/AnsiballZ_command.py" <<< 23826 1726867423.15169: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpfq2vudnb" to remote "/root/.ansible/tmp/ansible-tmp-1726867422.9556816-24122-220727239697016/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867422.9556816-24122-220727239697016/AnsiballZ_command.py" <<< 23826 1726867423.16836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867423.16839: stdout chunk (state=3): >>><<< 23826 1726867423.16841: stderr chunk (state=3): >>><<< 23826 1726867423.16842: done transferring module to remote 23826 1726867423.16844: _low_level_execute_command(): starting 23826 1726867423.16847: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867422.9556816-24122-220727239697016/ /root/.ansible/tmp/ansible-tmp-1726867422.9556816-24122-220727239697016/AnsiballZ_command.py && sleep 0' 23826 1726867423.18075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867423.18287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867423.18416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867423.18461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867423.20992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867423.21031: stderr chunk (state=3): >>><<< 23826 1726867423.21048: stdout chunk (state=3): >>><<< 23826 1726867423.21069: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 23826 1726867423.21083: _low_level_execute_command(): starting 23826 1726867423.21092: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867422.9556816-24122-220727239697016/AnsiballZ_command.py && sleep 0' 23826 1726867423.21683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867423.21696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867423.21715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867423.21747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867423.21794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867423.21879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867423.21895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867423.21984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867423.46938: stdout chunk (state=3): >>> <<< 23826 1726867423.46969: stdout chunk (state=3): >>>{"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:23:43.463140", "end": "2024-09-20 17:23:43.467769", "delta": "0:00:00.004629", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 23826 1726867423.49393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 23826 1726867423.49397: stdout chunk (state=3): >>><<< 23826 1726867423.49399: stderr chunk (state=3): >>><<< 23826 1726867423.49402: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:23:43.463140", "end": "2024-09-20 17:23:43.467769", "delta": "0:00:00.004629", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
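[editor note] The invocation just logged (module_args chdir=/sys/class/net, _raw_params='ls -1') and the callback result that follows (changed=false even though the raw module reported changed=true, with the registered value later read back as '_current_interfaces') let the task at get_current_interfaces.yml:3 be reconstructed. This is a minimal sketch inferred from those log entries, not the authoritative file from the collection; the changed_when setting in particular is an assumption that would explain the changed=false result.

- name: Gather current interface info
  command: ls -1                 # each line of stdout is one entry under /sys/class/net
  args:
    chdir: /sys/class/net        # directory listing = current network interfaces
  register: _current_interfaces  # presumed register target, based on the variable name seen later in this log
  changed_when: false            # assumed; reconciles the raw changed=true with the reported changed=false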
23826 1726867423.49404: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867422.9556816-24122-220727239697016/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867423.49406: _low_level_execute_command(): starting 23826 1726867423.49411: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867422.9556816-24122-220727239697016/ > /dev/null 2>&1 && sleep 0' 23826 1726867423.50172: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867423.50211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867423.50235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867423.50258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867423.50350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867423.52955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867423.52972: stdout chunk (state=3): >>><<< 23826 1726867423.52995: stderr chunk (state=3): >>><<< 23826 1726867423.53018: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 
10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 23826 1726867423.53030: handler run complete 23826 1726867423.53072: Evaluated conditional (False): False 23826 1726867423.53076: attempt loop complete, returning result 23826 1726867423.53186: _execute() done 23826 1726867423.53189: dumping result to json 23826 1726867423.53191: done dumping result, returning 23826 1726867423.53193: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0affcac9-a3a5-a92d-a3ea-000000000192] 23826 1726867423.53196: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000192 23826 1726867423.53269: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000192 23826 1726867423.53272: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.004629", "end": "2024-09-20 17:23:43.467769", "rc": 0, "start": "2024-09-20 17:23:43.463140" } STDOUT: bonding_masters eth0 lo 23826 1726867423.53361: no more pending results, returning what we have 23826 1726867423.53365: results queue empty 23826 1726867423.53366: checking for any_errors_fatal 23826 1726867423.53367: done checking for any_errors_fatal 23826 1726867423.53368: checking for max_fail_percentage 23826 1726867423.53370: done checking for max_fail_percentage 23826 1726867423.53371: checking to see if all hosts have failed and the running result is not ok 23826 1726867423.53372: done checking to see if all hosts have failed 23826 1726867423.53373: getting the remaining hosts for this loop 23826 1726867423.53374: done getting the remaining hosts for this loop 23826 1726867423.53586: getting the next task for host managed_node2 23826 1726867423.53593: done getting next task for host managed_node2 23826 1726867423.53596: ^ task is: TASK: Set current_interfaces 23826 1726867423.53600: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867423.53604: getting variables 23826 1726867423.53667: in VariableManager get_vars() 23826 1726867423.53712: Calling all_inventory to load vars for managed_node2 23826 1726867423.53716: Calling groups_inventory to load vars for managed_node2 23826 1726867423.53719: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867423.53729: Calling all_plugins_play to load vars for managed_node2 23826 1726867423.53732: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867423.53735: Calling groups_plugins_play to load vars for managed_node2 23826 1726867423.54041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867423.54253: done with get_vars() 23826 1726867423.54262: done getting variables 23826 1726867423.54319: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 17:23:43 -0400 (0:00:00.643) 0:00:05.554 ****** 23826 1726867423.54357: entering _queue_task() for managed_node2/set_fact 23826 1726867423.54628: worker is 1 (out of 1 available) 23826 1726867423.54639: exiting _queue_task() for managed_node2/set_fact 23826 1726867423.54650: done queuing things up, now waiting for results queue to drain 23826 1726867423.54652: waiting for pending results... 
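[editor note] The 'Set current_interfaces' task queued here reads the registered '_current_interfaces' value and publishes it as the fact current_interfaces; the result shown a little further down matches the stdout lines of the preceding ls run. A hedged sketch of what get_current_interfaces.yml:9 plausibly contains; the stdout_lines expression is an assumption consistent with that output.

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"  # ['bonding_masters', 'eth0', 'lo'] in this run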
23826 1726867423.55105: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 23826 1726867423.55112: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000193 23826 1726867423.55116: variable 'ansible_search_path' from source: unknown 23826 1726867423.55119: variable 'ansible_search_path' from source: unknown 23826 1726867423.55121: calling self._execute() 23826 1726867423.55206: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867423.55228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867423.55243: variable 'omit' from source: magic vars 23826 1726867423.55622: variable 'ansible_distribution_major_version' from source: facts 23826 1726867423.55645: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867423.55663: variable 'omit' from source: magic vars 23826 1726867423.55755: variable 'omit' from source: magic vars 23826 1726867423.55827: variable '_current_interfaces' from source: set_fact 23826 1726867423.55907: variable 'omit' from source: magic vars 23826 1726867423.55956: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867423.56011: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867423.56036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867423.56059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867423.56089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867423.56182: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867423.56191: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867423.56197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867423.56227: Set connection var ansible_timeout to 10 23826 1726867423.56239: Set connection var ansible_shell_executable to /bin/sh 23826 1726867423.56245: Set connection var ansible_connection to ssh 23826 1726867423.56254: Set connection var ansible_pipelining to False 23826 1726867423.56259: Set connection var ansible_shell_type to sh 23826 1726867423.56266: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867423.56299: variable 'ansible_shell_executable' from source: unknown 23826 1726867423.56312: variable 'ansible_connection' from source: unknown 23826 1726867423.56319: variable 'ansible_module_compression' from source: unknown 23826 1726867423.56324: variable 'ansible_shell_type' from source: unknown 23826 1726867423.56329: variable 'ansible_shell_executable' from source: unknown 23826 1726867423.56334: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867423.56340: variable 'ansible_pipelining' from source: unknown 23826 1726867423.56345: variable 'ansible_timeout' from source: unknown 23826 1726867423.56350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867423.56497: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 23826 1726867423.56628: variable 'omit' from source: magic vars 23826 1726867423.56631: starting attempt loop 23826 1726867423.56633: running the handler 23826 1726867423.56635: handler run complete 23826 1726867423.56637: attempt loop complete, returning result 23826 1726867423.56639: _execute() done 23826 1726867423.56642: dumping result to json 23826 1726867423.56644: done dumping result, returning 23826 1726867423.56646: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0affcac9-a3a5-a92d-a3ea-000000000193] 23826 1726867423.56648: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000193 23826 1726867423.56714: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000193 23826 1726867423.56717: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 23826 1726867423.56779: no more pending results, returning what we have 23826 1726867423.56783: results queue empty 23826 1726867423.56784: checking for any_errors_fatal 23826 1726867423.56792: done checking for any_errors_fatal 23826 1726867423.56792: checking for max_fail_percentage 23826 1726867423.56794: done checking for max_fail_percentage 23826 1726867423.56795: checking to see if all hosts have failed and the running result is not ok 23826 1726867423.56796: done checking to see if all hosts have failed 23826 1726867423.56797: getting the remaining hosts for this loop 23826 1726867423.56798: done getting the remaining hosts for this loop 23826 1726867423.56803: getting the next task for host managed_node2 23826 1726867423.56811: done getting next task for host managed_node2 23826 1726867423.56813: ^ task is: TASK: Show current_interfaces 23826 1726867423.56816: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867423.56821: getting variables 23826 1726867423.56822: in VariableManager get_vars() 23826 1726867423.56861: Calling all_inventory to load vars for managed_node2 23826 1726867423.56864: Calling groups_inventory to load vars for managed_node2 23826 1726867423.56867: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867423.56981: Calling all_plugins_play to load vars for managed_node2 23826 1726867423.56985: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867423.56994: Calling groups_plugins_play to load vars for managed_node2 23826 1726867423.57343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867423.57590: done with get_vars() 23826 1726867423.57600: done getting variables 23826 1726867423.57708: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 17:23:43 -0400 (0:00:00.033) 0:00:05.588 ****** 23826 1726867423.57743: entering _queue_task() for managed_node2/debug 23826 1726867423.57745: Creating lock for debug 23826 1726867423.58035: worker is 1 (out of 1 available) 23826 1726867423.58055: exiting _queue_task() for managed_node2/debug 23826 1726867423.58067: done queuing things up, now waiting for results queue to drain 23826 1726867423.58069: waiting for pending results... 
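[editor note] The action queued here is a debug task at show_interfaces.yml:5. The MSG printed in its result ("current_interfaces: [...]") suggests a task along the following lines; the exact msg wording is inferred from that output rather than copied from the collection.

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"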
23826 1726867423.58495: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 23826 1726867423.58500: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000116 23826 1726867423.58503: variable 'ansible_search_path' from source: unknown 23826 1726867423.58506: variable 'ansible_search_path' from source: unknown 23826 1726867423.58512: calling self._execute() 23826 1726867423.58629: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867423.58635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867423.58638: variable 'omit' from source: magic vars 23826 1726867423.58967: variable 'ansible_distribution_major_version' from source: facts 23826 1726867423.58986: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867423.58998: variable 'omit' from source: magic vars 23826 1726867423.59040: variable 'omit' from source: magic vars 23826 1726867423.59146: variable 'current_interfaces' from source: set_fact 23826 1726867423.59186: variable 'omit' from source: magic vars 23826 1726867423.59233: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867423.59383: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867423.59387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867423.59390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867423.59393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867423.59396: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867423.59398: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867423.59401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867423.59476: Set connection var ansible_timeout to 10 23826 1726867423.59496: Set connection var ansible_shell_executable to /bin/sh 23826 1726867423.59502: Set connection var ansible_connection to ssh 23826 1726867423.59515: Set connection var ansible_pipelining to False 23826 1726867423.59521: Set connection var ansible_shell_type to sh 23826 1726867423.59528: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867423.59551: variable 'ansible_shell_executable' from source: unknown 23826 1726867423.59557: variable 'ansible_connection' from source: unknown 23826 1726867423.59563: variable 'ansible_module_compression' from source: unknown 23826 1726867423.59568: variable 'ansible_shell_type' from source: unknown 23826 1726867423.59574: variable 'ansible_shell_executable' from source: unknown 23826 1726867423.59581: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867423.59587: variable 'ansible_pipelining' from source: unknown 23826 1726867423.59592: variable 'ansible_timeout' from source: unknown 23826 1726867423.59604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867423.59736: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
23826 1726867423.59752: variable 'omit' from source: magic vars 23826 1726867423.59760: starting attempt loop 23826 1726867423.59766: running the handler 23826 1726867423.59816: handler run complete 23826 1726867423.59833: attempt loop complete, returning result 23826 1726867423.59839: _execute() done 23826 1726867423.59881: dumping result to json 23826 1726867423.59883: done dumping result, returning 23826 1726867423.59886: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0affcac9-a3a5-a92d-a3ea-000000000116] 23826 1726867423.59888: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000116 23826 1726867423.60122: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000116 23826 1726867423.60126: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 23826 1726867423.60164: no more pending results, returning what we have 23826 1726867423.60167: results queue empty 23826 1726867423.60168: checking for any_errors_fatal 23826 1726867423.60173: done checking for any_errors_fatal 23826 1726867423.60173: checking for max_fail_percentage 23826 1726867423.60174: done checking for max_fail_percentage 23826 1726867423.60175: checking to see if all hosts have failed and the running result is not ok 23826 1726867423.60176: done checking to see if all hosts have failed 23826 1726867423.60178: getting the remaining hosts for this loop 23826 1726867423.60179: done getting the remaining hosts for this loop 23826 1726867423.60182: getting the next task for host managed_node2 23826 1726867423.60188: done getting next task for host managed_node2 23826 1726867423.60190: ^ task is: TASK: Include the task 'manage_test_interface.yml' 23826 1726867423.60192: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867423.60195: getting variables 23826 1726867423.60196: in VariableManager get_vars() 23826 1726867423.60232: Calling all_inventory to load vars for managed_node2 23826 1726867423.60235: Calling groups_inventory to load vars for managed_node2 23826 1726867423.60238: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867423.60247: Calling all_plugins_play to load vars for managed_node2 23826 1726867423.60250: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867423.60253: Calling groups_plugins_play to load vars for managed_node2 23826 1726867423.60427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867423.60625: done with get_vars() 23826 1726867423.60635: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:16 Friday 20 September 2024 17:23:43 -0400 (0:00:00.029) 0:00:05.618 ****** 23826 1726867423.60728: entering _queue_task() for managed_node2/include_tasks 23826 1726867423.61095: worker is 1 (out of 1 available) 23826 1726867423.61106: exiting _queue_task() for managed_node2/include_tasks 23826 1726867423.61116: done queuing things up, now waiting for results queue to drain 23826 1726867423.61117: waiting for pending results... 
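[editor note] The include queued here comes from tests_ipv6_disabled.yml:16 and resolves to playbooks/tasks/manage_test_interface.yml. The later guard tasks read 'state' from include params and 'type' from set_fact, so the include presumably passes at least a state variable; the actual value is not visible in this log. A hedged sketch only:

- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml
  vars:
    state: present   # hypothetical placeholder; the log only shows that 'state' arrives via include params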
23826 1726867423.61271: running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' 23826 1726867423.61369: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000000d 23826 1726867423.61420: variable 'ansible_search_path' from source: unknown 23826 1726867423.61441: calling self._execute() 23826 1726867423.61532: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867423.61541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867423.61558: variable 'omit' from source: magic vars 23826 1726867423.61965: variable 'ansible_distribution_major_version' from source: facts 23826 1726867423.61968: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867423.61970: _execute() done 23826 1726867423.61972: dumping result to json 23826 1726867423.61975: done dumping result, returning 23826 1726867423.61979: done running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' [0affcac9-a3a5-a92d-a3ea-00000000000d] 23826 1726867423.61981: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000000d 23826 1726867423.62101: no more pending results, returning what we have 23826 1726867423.62106: in VariableManager get_vars() 23826 1726867423.62147: Calling all_inventory to load vars for managed_node2 23826 1726867423.62150: Calling groups_inventory to load vars for managed_node2 23826 1726867423.62153: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867423.62166: Calling all_plugins_play to load vars for managed_node2 23826 1726867423.62169: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867423.62172: Calling groups_plugins_play to load vars for managed_node2 23826 1726867423.62554: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000000d 23826 1726867423.62557: WORKER PROCESS EXITING 23826 1726867423.62586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867423.62784: done with get_vars() 23826 1726867423.62796: variable 'ansible_search_path' from source: unknown 23826 1726867423.62808: we have included files to process 23826 1726867423.62809: generating all_blocks data 23826 1726867423.62811: done generating all_blocks data 23826 1726867423.62814: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 23826 1726867423.62815: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 23826 1726867423.62818: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 23826 1726867423.63361: in VariableManager get_vars() 23826 1726867423.63382: done with get_vars() 23826 1726867423.63606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 23826 1726867423.64445: done processing included file 23826 1726867423.64447: iterating over new_blocks loaded from include file 23826 1726867423.64449: in VariableManager get_vars() 23826 1726867423.64464: done with get_vars() 23826 1726867423.64466: filtering new block on tags 23826 1726867423.64498: done filtering new block on tags 23826 1726867423.64500: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node2 23826 1726867423.64505: extending task lists for all hosts with included blocks 23826 1726867423.65718: done extending task lists 23826 1726867423.65720: done processing included files 23826 1726867423.65721: results queue empty 23826 1726867423.65721: checking for any_errors_fatal 23826 1726867423.65724: done checking for any_errors_fatal 23826 1726867423.65725: checking for max_fail_percentage 23826 1726867423.65726: done checking for max_fail_percentage 23826 1726867423.65726: checking to see if all hosts have failed and the running result is not ok 23826 1726867423.65727: done checking to see if all hosts have failed 23826 1726867423.65728: getting the remaining hosts for this loop 23826 1726867423.65729: done getting the remaining hosts for this loop 23826 1726867423.65731: getting the next task for host managed_node2 23826 1726867423.65735: done getting next task for host managed_node2 23826 1726867423.65737: ^ task is: TASK: Ensure state in ["present", "absent"] 23826 1726867423.65746: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867423.65748: getting variables 23826 1726867423.65749: in VariableManager get_vars() 23826 1726867423.65760: Calling all_inventory to load vars for managed_node2 23826 1726867423.65762: Calling groups_inventory to load vars for managed_node2 23826 1726867423.65764: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867423.65769: Calling all_plugins_play to load vars for managed_node2 23826 1726867423.65771: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867423.65774: Calling groups_plugins_play to load vars for managed_node2 23826 1726867423.65935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867423.66126: done with get_vars() 23826 1726867423.66135: done getting variables 23826 1726867423.66203: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 17:23:43 -0400 (0:00:00.054) 0:00:05.673 ****** 23826 1726867423.66228: entering _queue_task() for managed_node2/fail 23826 1726867423.66230: Creating lock for fail 23826 1726867423.66498: worker is 1 (out of 1 available) 23826 1726867423.66517: exiting _queue_task() for managed_node2/fail 23826 1726867423.66528: done queuing things up, now waiting for results queue to drain 23826 1726867423.66530: waiting for pending results... 
23826 1726867423.66823: running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] 23826 1726867423.67069: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000001ae 23826 1726867423.67383: variable 'ansible_search_path' from source: unknown 23826 1726867423.67387: variable 'ansible_search_path' from source: unknown 23826 1726867423.67390: calling self._execute() 23826 1726867423.67428: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867423.67440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867423.67830: variable 'omit' from source: magic vars 23826 1726867423.68311: variable 'ansible_distribution_major_version' from source: facts 23826 1726867423.68327: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867423.68462: variable 'state' from source: include params 23826 1726867423.68692: Evaluated conditional (state not in ["present", "absent"]): False 23826 1726867423.68721: when evaluation is False, skipping this task 23826 1726867423.68730: _execute() done 23826 1726867423.68738: dumping result to json 23826 1726867423.68745: done dumping result, returning 23826 1726867423.68756: done running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] [0affcac9-a3a5-a92d-a3ea-0000000001ae] 23826 1726867423.68766: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001ae skipping: [managed_node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 23826 1726867423.68913: no more pending results, returning what we have 23826 1726867423.68916: results queue empty 23826 1726867423.68917: checking for any_errors_fatal 23826 1726867423.68919: done checking for any_errors_fatal 23826 1726867423.68919: checking for max_fail_percentage 23826 1726867423.68921: done checking for max_fail_percentage 23826 1726867423.68921: checking to see if all hosts have failed and the running result is not ok 23826 1726867423.68922: done checking to see if all hosts have failed 23826 1726867423.68923: getting the remaining hosts for this loop 23826 1726867423.68924: done getting the remaining hosts for this loop 23826 1726867423.68928: getting the next task for host managed_node2 23826 1726867423.68933: done getting next task for host managed_node2 23826 1726867423.68935: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 23826 1726867423.68939: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867423.68941: getting variables 23826 1726867423.68943: in VariableManager get_vars() 23826 1726867423.68982: Calling all_inventory to load vars for managed_node2 23826 1726867423.68985: Calling groups_inventory to load vars for managed_node2 23826 1726867423.68988: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867423.68999: Calling all_plugins_play to load vars for managed_node2 23826 1726867423.69002: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867423.69005: Calling groups_plugins_play to load vars for managed_node2 23826 1726867423.69391: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001ae 23826 1726867423.69395: WORKER PROCESS EXITING 23826 1726867423.69410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867423.69885: done with get_vars() 23826 1726867423.69894: done getting variables 23826 1726867423.69951: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 17:23:43 -0400 (0:00:00.037) 0:00:05.711 ****** 23826 1726867423.69980: entering _queue_task() for managed_node2/fail 23826 1726867423.70250: worker is 1 (out of 1 available) 23826 1726867423.70263: exiting _queue_task() for managed_node2/fail 23826 1726867423.70274: done queuing things up, now waiting for results queue to drain 23826 1726867423.70276: waiting for pending results... 
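[editor note] Same pattern for the second guard at manage_test_interface.yml:8: the logged false_condition "type not in [\"dummy\", \"tap\", \"veth\"]" fixes the when expression, and the skip below confirms it evaluated False for this run's 'type' (set via set_fact). The msg text is again an assumption.

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: type must be 'dummy', 'tap' or 'veth'   # assumed wording
  when: type not in ["dummy", "tap", "veth"]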
23826 1726867423.70543: running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] 23826 1726867423.70634: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000001af 23826 1726867423.70647: variable 'ansible_search_path' from source: unknown 23826 1726867423.70651: variable 'ansible_search_path' from source: unknown 23826 1726867423.70686: calling self._execute() 23826 1726867423.70774: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867423.70782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867423.70793: variable 'omit' from source: magic vars 23826 1726867423.71136: variable 'ansible_distribution_major_version' from source: facts 23826 1726867423.71151: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867423.71286: variable 'type' from source: set_fact 23826 1726867423.71291: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 23826 1726867423.71295: when evaluation is False, skipping this task 23826 1726867423.71297: _execute() done 23826 1726867423.71300: dumping result to json 23826 1726867423.71305: done dumping result, returning 23826 1726867423.71312: done running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] [0affcac9-a3a5-a92d-a3ea-0000000001af] 23826 1726867423.71316: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001af 23826 1726867423.71531: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001af 23826 1726867423.71535: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 23826 1726867423.71567: no more pending results, returning what we have 23826 1726867423.71570: results queue empty 23826 1726867423.71571: checking for any_errors_fatal 23826 1726867423.71576: done checking for any_errors_fatal 23826 1726867423.71576: checking for max_fail_percentage 23826 1726867423.71579: done checking for max_fail_percentage 23826 1726867423.71580: checking to see if all hosts have failed and the running result is not ok 23826 1726867423.71581: done checking to see if all hosts have failed 23826 1726867423.71582: getting the remaining hosts for this loop 23826 1726867423.71583: done getting the remaining hosts for this loop 23826 1726867423.71586: getting the next task for host managed_node2 23826 1726867423.71590: done getting next task for host managed_node2 23826 1726867423.71592: ^ task is: TASK: Include the task 'show_interfaces.yml' 23826 1726867423.71595: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867423.71598: getting variables 23826 1726867423.71599: in VariableManager get_vars() 23826 1726867423.71630: Calling all_inventory to load vars for managed_node2 23826 1726867423.71633: Calling groups_inventory to load vars for managed_node2 23826 1726867423.71635: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867423.71644: Calling all_plugins_play to load vars for managed_node2 23826 1726867423.71646: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867423.71649: Calling groups_plugins_play to load vars for managed_node2 23826 1726867423.71818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867423.72250: done with get_vars() 23826 1726867423.72259: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 17:23:43 -0400 (0:00:00.023) 0:00:05.734 ****** 23826 1726867423.72347: entering _queue_task() for managed_node2/include_tasks 23826 1726867423.72766: worker is 1 (out of 1 available) 23826 1726867423.72981: exiting _queue_task() for managed_node2/include_tasks 23826 1726867423.72991: done queuing things up, now waiting for results queue to drain 23826 1726867423.72993: waiting for pending results... 23826 1726867423.73796: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 23826 1726867423.73800: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000001b0 23826 1726867423.73804: variable 'ansible_search_path' from source: unknown 23826 1726867423.73807: variable 'ansible_search_path' from source: unknown 23826 1726867423.73810: calling self._execute() 23826 1726867423.74542: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867423.74545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867423.74549: variable 'omit' from source: magic vars 23826 1726867423.74961: variable 'ansible_distribution_major_version' from source: facts 23826 1726867423.75102: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867423.75114: _execute() done 23826 1726867423.75123: dumping result to json 23826 1726867423.75131: done dumping result, returning 23826 1726867423.75142: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0affcac9-a3a5-a92d-a3ea-0000000001b0] 23826 1726867423.75151: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b0 23826 1726867423.75404: no more pending results, returning what we have 23826 1726867423.75412: in VariableManager get_vars() 23826 1726867423.75454: Calling all_inventory to load vars for managed_node2 23826 1726867423.75456: Calling groups_inventory to load vars for managed_node2 23826 1726867423.75459: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867423.75472: Calling all_plugins_play to load vars for managed_node2 23826 1726867423.75474: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867423.75480: Calling groups_plugins_play to load vars for managed_node2 23826 1726867423.75815: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b0 23826 1726867423.75819: WORKER PROCESS EXITING 23826 1726867423.75834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 23826 1726867423.76333: done with get_vars() 23826 1726867423.76340: variable 'ansible_search_path' from source: unknown 23826 1726867423.76341: variable 'ansible_search_path' from source: unknown 23826 1726867423.76373: we have included files to process 23826 1726867423.76374: generating all_blocks data 23826 1726867423.76375: done generating all_blocks data 23826 1726867423.76380: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 23826 1726867423.76381: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 23826 1726867423.76383: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 23826 1726867423.76681: in VariableManager get_vars() 23826 1726867423.76702: done with get_vars() 23826 1726867423.76815: done processing included file 23826 1726867423.76818: iterating over new_blocks loaded from include file 23826 1726867423.76819: in VariableManager get_vars() 23826 1726867423.76835: done with get_vars() 23826 1726867423.76837: filtering new block on tags 23826 1726867423.76856: done filtering new block on tags 23826 1726867423.76858: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 23826 1726867423.76862: extending task lists for all hosts with included blocks 23826 1726867423.77879: done extending task lists 23826 1726867423.77881: done processing included files 23826 1726867423.77881: results queue empty 23826 1726867423.77882: checking for any_errors_fatal 23826 1726867423.77885: done checking for any_errors_fatal 23826 1726867423.77886: checking for max_fail_percentage 23826 1726867423.77887: done checking for max_fail_percentage 23826 1726867423.77888: checking to see if all hosts have failed and the running result is not ok 23826 1726867423.77889: done checking to see if all hosts have failed 23826 1726867423.77890: getting the remaining hosts for this loop 23826 1726867423.77891: done getting the remaining hosts for this loop 23826 1726867423.77893: getting the next task for host managed_node2 23826 1726867423.77898: done getting next task for host managed_node2 23826 1726867423.77900: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 23826 1726867423.77902: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867423.77905: getting variables 23826 1726867423.77906: in VariableManager get_vars() 23826 1726867423.77918: Calling all_inventory to load vars for managed_node2 23826 1726867423.77920: Calling groups_inventory to load vars for managed_node2 23826 1726867423.77922: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867423.77927: Calling all_plugins_play to load vars for managed_node2 23826 1726867423.77929: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867423.77932: Calling groups_plugins_play to load vars for managed_node2 23826 1726867423.78072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867423.78695: done with get_vars() 23826 1726867423.78705: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 17:23:43 -0400 (0:00:00.064) 0:00:05.799 ****** 23826 1726867423.78773: entering _queue_task() for managed_node2/include_tasks 23826 1726867423.79175: worker is 1 (out of 1 available) 23826 1726867423.79192: exiting _queue_task() for managed_node2/include_tasks 23826 1726867423.79204: done queuing things up, now waiting for results queue to drain 23826 1726867423.79205: waiting for pending results... 23826 1726867423.79690: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 23826 1726867423.79918: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000245 23826 1726867423.80037: variable 'ansible_search_path' from source: unknown 23826 1726867423.80040: variable 'ansible_search_path' from source: unknown 23826 1726867423.80145: calling self._execute() 23826 1726867423.80282: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867423.80295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867423.80310: variable 'omit' from source: magic vars 23826 1726867423.81248: variable 'ansible_distribution_major_version' from source: facts 23826 1726867423.81302: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867423.81314: _execute() done 23826 1726867423.81340: dumping result to json 23826 1726867423.81349: done dumping result, returning 23826 1726867423.81444: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0affcac9-a3a5-a92d-a3ea-000000000245] 23826 1726867423.81448: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000245 23826 1726867423.81587: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000245 23826 1726867423.81590: WORKER PROCESS EXITING 23826 1726867423.81625: no more pending results, returning what we have 23826 1726867423.81630: in VariableManager get_vars() 23826 1726867423.81679: Calling all_inventory to load vars for managed_node2 23826 1726867423.81683: Calling groups_inventory to load vars for managed_node2 23826 1726867423.81685: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867423.81700: Calling all_plugins_play to load vars for managed_node2 23826 1726867423.81703: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867423.81706: Calling groups_plugins_play to load vars for managed_node2 23826 1726867423.82014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 23826 1726867423.82442: done with get_vars() 23826 1726867423.82449: variable 'ansible_search_path' from source: unknown 23826 1726867423.82451: variable 'ansible_search_path' from source: unknown 23826 1726867423.82534: we have included files to process 23826 1726867423.82536: generating all_blocks data 23826 1726867423.82537: done generating all_blocks data 23826 1726867423.82541: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 23826 1726867423.82542: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 23826 1726867423.82544: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 23826 1726867423.82827: done processing included file 23826 1726867423.82829: iterating over new_blocks loaded from include file 23826 1726867423.82830: in VariableManager get_vars() 23826 1726867423.82845: done with get_vars() 23826 1726867423.82847: filtering new block on tags 23826 1726867423.82866: done filtering new block on tags 23826 1726867423.82868: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 23826 1726867423.82873: extending task lists for all hosts with included blocks 23826 1726867423.83032: done extending task lists 23826 1726867423.83033: done processing included files 23826 1726867423.83034: results queue empty 23826 1726867423.83035: checking for any_errors_fatal 23826 1726867423.83042: done checking for any_errors_fatal 23826 1726867423.83043: checking for max_fail_percentage 23826 1726867423.83044: done checking for max_fail_percentage 23826 1726867423.83045: checking to see if all hosts have failed and the running result is not ok 23826 1726867423.83046: done checking to see if all hosts have failed 23826 1726867423.83047: getting the remaining hosts for this loop 23826 1726867423.83048: done getting the remaining hosts for this loop 23826 1726867423.83050: getting the next task for host managed_node2 23826 1726867423.83055: done getting next task for host managed_node2 23826 1726867423.83057: ^ task is: TASK: Gather current interface info 23826 1726867423.83060: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 23826 1726867423.83062: getting variables 23826 1726867423.83063: in VariableManager get_vars() 23826 1726867423.83073: Calling all_inventory to load vars for managed_node2 23826 1726867423.83075: Calling groups_inventory to load vars for managed_node2 23826 1726867423.83078: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867423.83083: Calling all_plugins_play to load vars for managed_node2 23826 1726867423.83084: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867423.83087: Calling groups_plugins_play to load vars for managed_node2 23826 1726867423.83233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867423.83416: done with get_vars() 23826 1726867423.83424: done getting variables 23826 1726867423.83462: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 17:23:43 -0400 (0:00:00.047) 0:00:05.846 ****** 23826 1726867423.83496: entering _queue_task() for managed_node2/command 23826 1726867423.83732: worker is 1 (out of 1 available) 23826 1726867423.83743: exiting _queue_task() for managed_node2/command 23826 1726867423.83756: done queuing things up, now waiting for results queue to drain 23826 1726867423.83757: waiting for pending results... 
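The include chain traced above nests three task files: manage_test_interface.yml:13 includes show_interfaces.yml, and show_interfaces.yml:3 in turn includes get_current_interfaces.yml, whose first task ("Gather current interface info", line 3) is being queued here. Neither file's contents appear in the log; a minimal sketch of the two include steps consistent with the logged task names and paths (the relative include paths are assumptions):

# manage_test_interface.yml, near line 13 (sketch; actual file not shown in the log)
- name: Include the task 'show_interfaces.yml'
  include_tasks: show_interfaces.yml  # assumed to resolve relative to the including task file

# show_interfaces.yml, near line 3 (sketch)
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml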
23826 1726867423.84096: running TaskExecutor() for managed_node2/TASK: Gather current interface info 23826 1726867423.84125: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000027c 23826 1726867423.84148: variable 'ansible_search_path' from source: unknown 23826 1726867423.84152: variable 'ansible_search_path' from source: unknown 23826 1726867423.84194: calling self._execute() 23826 1726867423.84303: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867423.84311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867423.84323: variable 'omit' from source: magic vars 23826 1726867423.84925: variable 'ansible_distribution_major_version' from source: facts 23826 1726867423.84930: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867423.84937: variable 'omit' from source: magic vars 23826 1726867423.84986: variable 'omit' from source: magic vars 23826 1726867423.85034: variable 'omit' from source: magic vars 23826 1726867423.85073: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867423.85108: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867423.85383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867423.85388: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867423.85391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867423.85393: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867423.85396: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867423.85398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867423.85400: Set connection var ansible_timeout to 10 23826 1726867423.85403: Set connection var ansible_shell_executable to /bin/sh 23826 1726867423.85405: Set connection var ansible_connection to ssh 23826 1726867423.85407: Set connection var ansible_pipelining to False 23826 1726867423.85409: Set connection var ansible_shell_type to sh 23826 1726867423.85411: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867423.85414: variable 'ansible_shell_executable' from source: unknown 23826 1726867423.85416: variable 'ansible_connection' from source: unknown 23826 1726867423.85418: variable 'ansible_module_compression' from source: unknown 23826 1726867423.85420: variable 'ansible_shell_type' from source: unknown 23826 1726867423.85423: variable 'ansible_shell_executable' from source: unknown 23826 1726867423.85425: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867423.85427: variable 'ansible_pipelining' from source: unknown 23826 1726867423.85429: variable 'ansible_timeout' from source: unknown 23826 1726867423.85431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867423.85534: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867423.85546: variable 'omit' from source: magic vars 23826 
1726867423.85583: starting attempt loop 23826 1726867423.85587: running the handler 23826 1726867423.85590: _low_level_execute_command(): starting 23826 1726867423.85592: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867423.86402: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867423.86445: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867423.86457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867423.86474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867423.86559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867423.89283: stdout chunk (state=3): >>>/root <<< 23826 1726867423.89286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867423.89288: stdout chunk (state=3): >>><<< 23826 1726867423.89290: stderr chunk (state=3): >>><<< 23826 1726867423.89293: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 23826 1726867423.89295: _low_level_execute_command(): starting 23826 1726867423.89297: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867423.891364-24168-258441882986009 `" && echo ansible-tmp-1726867423.891364-24168-258441882986009="` echo 
/root/.ansible/tmp/ansible-tmp-1726867423.891364-24168-258441882986009 `" ) && sleep 0' 23826 1726867423.89987: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867423.89990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867423.90035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867423.90110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867423.90138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867423.90224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867423.93014: stdout chunk (state=3): >>>ansible-tmp-1726867423.891364-24168-258441882986009=/root/.ansible/tmp/ansible-tmp-1726867423.891364-24168-258441882986009 <<< 23826 1726867423.93152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867423.93397: stderr chunk (state=3): >>><<< 23826 1726867423.93403: stdout chunk (state=3): >>><<< 23826 1726867423.93406: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867423.891364-24168-258441882986009=/root/.ansible/tmp/ansible-tmp-1726867423.891364-24168-258441882986009 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 23826 1726867423.93411: variable 'ansible_module_compression' from source: unknown 23826 1726867423.93461: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 23826 
1726867423.93525: variable 'ansible_facts' from source: unknown 23826 1726867423.93697: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867423.891364-24168-258441882986009/AnsiballZ_command.py 23826 1726867423.94124: Sending initial data 23826 1726867423.94127: Sent initial data (155 bytes) 23826 1726867423.95036: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867423.95051: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867423.95096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 23826 1726867423.95119: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 23826 1726867423.95202: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867423.95235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867423.95254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867423.95275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867423.95444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867423.97696: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867423.97722: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867423.97789: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmp_gj7cilg /root/.ansible/tmp/ansible-tmp-1726867423.891364-24168-258441882986009/AnsiballZ_command.py <<< 23826 1726867423.97795: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867423.891364-24168-258441882986009/AnsiballZ_command.py" <<< 23826 1726867423.97899: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmp_gj7cilg" to remote "/root/.ansible/tmp/ansible-tmp-1726867423.891364-24168-258441882986009/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867423.891364-24168-258441882986009/AnsiballZ_command.py" <<< 23826 1726867423.99075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867423.99082: stdout chunk (state=3): >>><<< 23826 1726867423.99088: stderr chunk (state=3): >>><<< 23826 1726867423.99153: done transferring module to remote 23826 1726867423.99163: _low_level_execute_command(): starting 23826 1726867423.99168: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867423.891364-24168-258441882986009/ /root/.ansible/tmp/ansible-tmp-1726867423.891364-24168-258441882986009/AnsiballZ_command.py && sleep 0' 23826 1726867423.99957: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867423.99966: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867424.00025: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867424.00080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867424.00119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867424.00156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867424.02787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867424.02813: stdout chunk (state=3): >>><<< 23826 1726867424.02816: stderr chunk (state=3): >>><<< 23826 1726867424.02902: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 23826 1726867424.02906: _low_level_execute_command(): starting 23826 1726867424.02908: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867423.891364-24168-258441882986009/AnsiballZ_command.py && sleep 0' 23826 1726867424.03514: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867424.03517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867424.03520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867424.03522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867424.03524: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867424.03527: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867424.03529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867424.03541: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867424.03548: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 23826 1726867424.03555: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 23826 1726867424.03563: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867424.03572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867424.03586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867424.03594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867424.03601: stderr chunk (state=3): >>>debug2: match found <<< 23826 1726867424.03618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867424.03690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867424.03696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867424.03775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867424.24286: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:23:44.236993", "end": "2024-09-20 17:23:44.240328", "delta": "0:00:00.003335", "msg": "", "invocation": {"module_args": {"chdir": 
"/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 23826 1726867424.26485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 23826 1726867424.26490: stdout chunk (state=3): >>><<< 23826 1726867424.26493: stderr chunk (state=3): >>><<< 23826 1726867424.26701: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:23:44.236993", "end": "2024-09-20 17:23:44.240328", "delta": "0:00:00.003335", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
23826 1726867424.26738: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867423.891364-24168-258441882986009/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867424.26746: _low_level_execute_command(): starting 23826 1726867424.26751: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867423.891364-24168-258441882986009/ > /dev/null 2>&1 && sleep 0' 23826 1726867424.28095: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867424.28152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867424.28156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867424.28159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867424.28492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867424.28501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867424.28565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<< 23826 1726867424.31181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867424.31232: stderr chunk (state=3): >>><<< 23826 1726867424.31257: stdout chunk (state=3): >>><<< 23826 1726867424.31282: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 3 debug2: Received exit status from master 0 23826 1726867424.31295: handler run complete 23826 1726867424.31325: Evaluated conditional (False): False 23826 1726867424.31342: attempt loop complete, returning result 23826 1726867424.31373: _execute() done 23826 1726867424.31385: dumping result to json 23826 1726867424.31396: done dumping result, returning 23826 1726867424.31412: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0affcac9-a3a5-a92d-a3ea-00000000027c] 23826 1726867424.31423: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000027c 23826 1726867424.31790: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000027c 23826 1726867424.31793: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003335", "end": "2024-09-20 17:23:44.240328", "rc": 0, "start": "2024-09-20 17:23:44.236993" } STDOUT: bonding_masters eth0 lo 23826 1726867424.31874: no more pending results, returning what we have 23826 1726867424.31880: results queue empty 23826 1726867424.31882: checking for any_errors_fatal 23826 1726867424.31883: done checking for any_errors_fatal 23826 1726867424.31884: checking for max_fail_percentage 23826 1726867424.31885: done checking for max_fail_percentage 23826 1726867424.31886: checking to see if all hosts have failed and the running result is not ok 23826 1726867424.31887: done checking to see if all hosts have failed 23826 1726867424.31888: getting the remaining hosts for this loop 23826 1726867424.31889: done getting the remaining hosts for this loop 23826 1726867424.31893: getting the next task for host managed_node2 23826 1726867424.31901: done getting next task for host managed_node2 23826 1726867424.31904: ^ task is: TASK: Set current_interfaces 23826 1726867424.31912: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867424.31917: getting variables 23826 1726867424.31919: in VariableManager get_vars() 23826 1726867424.31957: Calling all_inventory to load vars for managed_node2 23826 1726867424.31960: Calling groups_inventory to load vars for managed_node2 23826 1726867424.31963: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867424.31974: Calling all_plugins_play to load vars for managed_node2 23826 1726867424.32385: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867424.32391: Calling groups_plugins_play to load vars for managed_node2 23826 1726867424.32647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867424.33289: done with get_vars() 23826 1726867424.33299: done getting variables 23826 1726867424.33359: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 17:23:44 -0400 (0:00:00.499) 0:00:06.346 ****** 23826 1726867424.33502: entering _queue_task() for managed_node2/set_fact 23826 1726867424.33981: worker is 1 (out of 1 available) 23826 1726867424.33999: exiting _queue_task() for managed_node2/set_fact 23826 1726867424.34014: done queuing things up, now waiting for results queue to drain 23826 1726867424.34015: waiting for pending results... 
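The command run that just completed above logs the full module invocation for "Gather current interface info": ansible.legacy.command with _raw_params "ls -1" and chdir "/sys/class/net", returning bonding_masters, eth0 and lo. A sketch of how that task likely reads in get_current_interfaces.yml:3 (the register name _current_interfaces is inferred from the '_current_interfaces' variable consumed by the Set current_interfaces task queued here, not from the module args themselves):

# get_current_interfaces.yml, near line 3 (sketch reconstructed from the logged module_args)
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces  # name inferred from the later "variable '_current_interfaces'" entry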
23826 1726867424.34252: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 23826 1726867424.34486: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000027d 23826 1726867424.34491: variable 'ansible_search_path' from source: unknown 23826 1726867424.34495: variable 'ansible_search_path' from source: unknown 23826 1726867424.34498: calling self._execute() 23826 1726867424.34534: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867424.34546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867424.34560: variable 'omit' from source: magic vars 23826 1726867424.34914: variable 'ansible_distribution_major_version' from source: facts 23826 1726867424.34931: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867424.34943: variable 'omit' from source: magic vars 23826 1726867424.34998: variable 'omit' from source: magic vars 23826 1726867424.35105: variable '_current_interfaces' from source: set_fact 23826 1726867424.35172: variable 'omit' from source: magic vars 23826 1726867424.35218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867424.35260: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867424.35289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867424.35312: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867424.35482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867424.35485: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867424.35487: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867424.35489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867424.35491: Set connection var ansible_timeout to 10 23826 1726867424.35493: Set connection var ansible_shell_executable to /bin/sh 23826 1726867424.35495: Set connection var ansible_connection to ssh 23826 1726867424.35499: Set connection var ansible_pipelining to False 23826 1726867424.35506: Set connection var ansible_shell_type to sh 23826 1726867424.35516: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867424.35543: variable 'ansible_shell_executable' from source: unknown 23826 1726867424.35552: variable 'ansible_connection' from source: unknown 23826 1726867424.35559: variable 'ansible_module_compression' from source: unknown 23826 1726867424.35566: variable 'ansible_shell_type' from source: unknown 23826 1726867424.35573: variable 'ansible_shell_executable' from source: unknown 23826 1726867424.35584: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867424.35588: variable 'ansible_pipelining' from source: unknown 23826 1726867424.35595: variable 'ansible_timeout' from source: unknown 23826 1726867424.35602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867424.35740: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 23826 1726867424.35757: variable 'omit' from source: magic vars 23826 1726867424.35768: starting attempt loop 23826 1726867424.35775: running the handler 23826 1726867424.35832: handler run complete 23826 1726867424.35872: attempt loop complete, returning result 23826 1726867424.35915: _execute() done 23826 1726867424.35924: dumping result to json 23826 1726867424.35962: done dumping result, returning 23826 1726867424.35980: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0affcac9-a3a5-a92d-a3ea-00000000027d] 23826 1726867424.36283: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000027d ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 23826 1726867424.36402: no more pending results, returning what we have 23826 1726867424.36405: results queue empty 23826 1726867424.36406: checking for any_errors_fatal 23826 1726867424.36414: done checking for any_errors_fatal 23826 1726867424.36415: checking for max_fail_percentage 23826 1726867424.36416: done checking for max_fail_percentage 23826 1726867424.36417: checking to see if all hosts have failed and the running result is not ok 23826 1726867424.36418: done checking to see if all hosts have failed 23826 1726867424.36419: getting the remaining hosts for this loop 23826 1726867424.36420: done getting the remaining hosts for this loop 23826 1726867424.36424: getting the next task for host managed_node2 23826 1726867424.36432: done getting next task for host managed_node2 23826 1726867424.36434: ^ task is: TASK: Show current_interfaces 23826 1726867424.36437: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867424.36441: getting variables 23826 1726867424.36442: in VariableManager get_vars() 23826 1726867424.36474: Calling all_inventory to load vars for managed_node2 23826 1726867424.36476: Calling groups_inventory to load vars for managed_node2 23826 1726867424.36480: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867424.36487: Calling all_plugins_play to load vars for managed_node2 23826 1726867424.36489: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867424.36492: Calling groups_plugins_play to load vars for managed_node2 23826 1726867424.36862: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000027d 23826 1726867424.36870: WORKER PROCESS EXITING 23826 1726867424.36938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867424.37369: done with get_vars() 23826 1726867424.37582: done getting variables 23826 1726867424.37641: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 17:23:44 -0400 (0:00:00.041) 0:00:06.388 ****** 23826 1726867424.37693: entering _queue_task() for managed_node2/debug 23826 1726867424.38156: worker is 1 (out of 1 available) 23826 1726867424.38169: exiting _queue_task() for managed_node2/debug 23826 1726867424.38181: done queuing things up, now waiting for results queue to drain 23826 1726867424.38183: waiting for pending results... 
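The Set current_interfaces result above turns the registered command output into the current_interfaces fact, which ends up as exactly the command's stdout lines. A plausible sketch of get_current_interfaces.yml:9 (the Jinja expression is an assumption; the log only shows the '_current_interfaces' input variable and the resulting list):

# get_current_interfaces.yml, near line 9 (sketch; expression assumed)
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"  # yields ["bonding_masters", "eth0", "lo"] as logged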
23826 1726867424.38602: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 23826 1726867424.38714: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000246 23826 1726867424.38749: variable 'ansible_search_path' from source: unknown 23826 1726867424.38759: variable 'ansible_search_path' from source: unknown 23826 1726867424.38800: calling self._execute() 23826 1726867424.38960: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867424.38964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867424.38967: variable 'omit' from source: magic vars 23826 1726867424.39299: variable 'ansible_distribution_major_version' from source: facts 23826 1726867424.39320: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867424.39331: variable 'omit' from source: magic vars 23826 1726867424.39378: variable 'omit' from source: magic vars 23826 1726867424.39483: variable 'current_interfaces' from source: set_fact 23826 1726867424.39523: variable 'omit' from source: magic vars 23826 1726867424.39566: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867424.39615: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867424.39640: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867424.39718: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867424.39721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867424.39723: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867424.39726: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867424.39730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867424.39842: Set connection var ansible_timeout to 10 23826 1726867424.39857: Set connection var ansible_shell_executable to /bin/sh 23826 1726867424.39864: Set connection var ansible_connection to ssh 23826 1726867424.39876: Set connection var ansible_pipelining to False 23826 1726867424.39886: Set connection var ansible_shell_type to sh 23826 1726867424.39897: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867424.39931: variable 'ansible_shell_executable' from source: unknown 23826 1726867424.39945: variable 'ansible_connection' from source: unknown 23826 1726867424.39982: variable 'ansible_module_compression' from source: unknown 23826 1726867424.39985: variable 'ansible_shell_type' from source: unknown 23826 1726867424.39987: variable 'ansible_shell_executable' from source: unknown 23826 1726867424.39989: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867424.39991: variable 'ansible_pipelining' from source: unknown 23826 1726867424.39993: variable 'ansible_timeout' from source: unknown 23826 1726867424.39995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867424.40136: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
23826 1726867424.40156: variable 'omit' from source: magic vars 23826 1726867424.40330: starting attempt loop 23826 1726867424.40335: running the handler 23826 1726867424.40338: handler run complete 23826 1726867424.40340: attempt loop complete, returning result 23826 1726867424.40342: _execute() done 23826 1726867424.40344: dumping result to json 23826 1726867424.40346: done dumping result, returning 23826 1726867424.40349: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0affcac9-a3a5-a92d-a3ea-000000000246] 23826 1726867424.40351: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000246 23826 1726867424.40419: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000246 23826 1726867424.40422: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 23826 1726867424.40472: no more pending results, returning what we have 23826 1726867424.40476: results queue empty 23826 1726867424.40479: checking for any_errors_fatal 23826 1726867424.40483: done checking for any_errors_fatal 23826 1726867424.40484: checking for max_fail_percentage 23826 1726867424.40489: done checking for max_fail_percentage 23826 1726867424.40490: checking to see if all hosts have failed and the running result is not ok 23826 1726867424.40491: done checking to see if all hosts have failed 23826 1726867424.40492: getting the remaining hosts for this loop 23826 1726867424.40494: done getting the remaining hosts for this loop 23826 1726867424.40498: getting the next task for host managed_node2 23826 1726867424.40512: done getting next task for host managed_node2 23826 1726867424.40515: ^ task is: TASK: Install iproute 23826 1726867424.40519: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867424.40523: getting variables 23826 1726867424.40524: in VariableManager get_vars() 23826 1726867424.40559: Calling all_inventory to load vars for managed_node2 23826 1726867424.40561: Calling groups_inventory to load vars for managed_node2 23826 1726867424.40564: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867424.40574: Calling all_plugins_play to load vars for managed_node2 23826 1726867424.40576: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867424.40886: Calling groups_plugins_play to load vars for managed_node2 23826 1726867424.41153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867424.41354: done with get_vars() 23826 1726867424.41362: done getting variables 23826 1726867424.41421: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 17:23:44 -0400 (0:00:00.037) 0:00:06.425 ****** 23826 1726867424.41450: entering _queue_task() for managed_node2/package 23826 1726867424.41671: worker is 1 (out of 1 available) 23826 1726867424.41690: exiting _queue_task() for managed_node2/package 23826 1726867424.41703: done queuing things up, now waiting for results queue to drain 23826 1726867424.41704: waiting for pending results... 
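A note on the per-task connection setup printed above (ansible_connection=ssh, ansible_timeout=10, ansible_shell_type=sh, ansible_shell_executable=/bin/sh, ansible_pipelining=False, ansible_module_compression=ZIP_DEFLATED): because pipelining is disabled, the trace that follows shows each module being packaged by AnsiballZ, copied to a remote temporary directory over SFTP, chmod'ed, executed, and then removed. The values below mirror the connection variables from the trace; grouping them as host vars for managed_node2 in an inventory file is an assumption, not a copy of the actual test inventory.

    # Hypothetical inventory.yml fragment; values taken from the connection
    # variables printed in the trace, placement under managed_node2 assumed.
    all:
      hosts:
        managed_node2:
          ansible_connection: ssh
          ansible_timeout: 10
          ansible_shell_type: sh
          ansible_shell_executable: /bin/sh
          ansible_pipelining: false
          ansible_module_compression: ZIP_DEFLATED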
23826 1726867424.42000: running TaskExecutor() for managed_node2/TASK: Install iproute 23826 1726867424.42252: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000001b1 23826 1726867424.42262: variable 'ansible_search_path' from source: unknown 23826 1726867424.42266: variable 'ansible_search_path' from source: unknown 23826 1726867424.42269: calling self._execute() 23826 1726867424.42568: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867424.42615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867424.42629: variable 'omit' from source: magic vars 23826 1726867424.43212: variable 'ansible_distribution_major_version' from source: facts 23826 1726867424.43216: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867424.43219: variable 'omit' from source: magic vars 23826 1726867424.43221: variable 'omit' from source: magic vars 23826 1726867424.43461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867424.45665: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867424.45747: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867424.45802: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867424.45848: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867424.45885: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867424.45987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867424.46028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867424.46060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867424.46113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867424.46133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867424.46248: variable '__network_is_ostree' from source: set_fact 23826 1726867424.46258: variable 'omit' from source: magic vars 23826 1726867424.46294: variable 'omit' from source: magic vars 23826 1726867424.46333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867424.46365: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867424.46395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867424.46424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 23826 1726867424.46440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867424.46472: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867424.46483: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867424.46491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867424.46598: Set connection var ansible_timeout to 10 23826 1726867424.46615: Set connection var ansible_shell_executable to /bin/sh 23826 1726867424.46622: Set connection var ansible_connection to ssh 23826 1726867424.46638: Set connection var ansible_pipelining to False 23826 1726867424.46782: Set connection var ansible_shell_type to sh 23826 1726867424.46785: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867424.46788: variable 'ansible_shell_executable' from source: unknown 23826 1726867424.46790: variable 'ansible_connection' from source: unknown 23826 1726867424.46792: variable 'ansible_module_compression' from source: unknown 23826 1726867424.46794: variable 'ansible_shell_type' from source: unknown 23826 1726867424.46796: variable 'ansible_shell_executable' from source: unknown 23826 1726867424.46797: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867424.46799: variable 'ansible_pipelining' from source: unknown 23826 1726867424.46801: variable 'ansible_timeout' from source: unknown 23826 1726867424.46803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867424.46843: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867424.46858: variable 'omit' from source: magic vars 23826 1726867424.46868: starting attempt loop 23826 1726867424.46874: running the handler 23826 1726867424.46887: variable 'ansible_facts' from source: unknown 23826 1726867424.46894: variable 'ansible_facts' from source: unknown 23826 1726867424.46948: _low_level_execute_command(): starting 23826 1726867424.46961: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867424.47649: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867424.47692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867424.47709: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867424.47794: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867424.47810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867424.47832: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867424.47848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867424.47935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867424.50294: stdout chunk (state=3): >>>/root <<< 23826 1726867424.50464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867424.50491: stdout chunk (state=3): >>><<< 23826 1726867424.50514: stderr chunk (state=3): >>><<< 23826 1726867424.50624: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 23826 1726867424.50635: _low_level_execute_command(): starting 23826 1726867424.50639: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867424.5053453-24234-141357066568955 `" && echo ansible-tmp-1726867424.5053453-24234-141357066568955="` echo /root/.ansible/tmp/ansible-tmp-1726867424.5053453-24234-141357066568955 `" ) && sleep 0' 23826 1726867424.51187: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867424.51203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867424.51231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867424.51250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867424.51269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867424.51282: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867424.51338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867424.51401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867424.51423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867424.51454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867424.51542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 23826 1726867424.54396: stdout chunk (state=3): >>>ansible-tmp-1726867424.5053453-24234-141357066568955=/root/.ansible/tmp/ansible-tmp-1726867424.5053453-24234-141357066568955 <<< 23826 1726867424.54510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867424.54836: stderr chunk (state=3): >>><<< 23826 1726867424.54840: stdout chunk (state=3): >>><<< 23826 1726867424.54842: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867424.5053453-24234-141357066568955=/root/.ansible/tmp/ansible-tmp-1726867424.5053453-24234-141357066568955 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 23826 1726867424.54845: variable 'ansible_module_compression' from source: unknown 23826 1726867424.54880: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 23826 1726867424.54982: ANSIBALLZ: Acquiring lock 23826 1726867424.54985: ANSIBALLZ: Lock acquired: 139851310993328 23826 1726867424.54988: ANSIBALLZ: Creating module 23826 1726867424.94016: ANSIBALLZ: Writing module into payload 23826 1726867424.94298: ANSIBALLZ: Writing module 23826 1726867424.94327: ANSIBALLZ: Renaming module 23826 1726867424.94351: ANSIBALLZ: Done creating module 23826 1726867424.94403: variable 'ansible_facts' from source: unknown 23826 1726867424.94545: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867424.5053453-24234-141357066568955/AnsiballZ_dnf.py 23826 1726867424.95033: Sending initial data 23826 1726867424.95113: Sent initial data (152 bytes) 23826 1726867424.96296: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867424.96487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867424.96523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867424.98321: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867424.98356: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867424.98431: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpwx1aovs_ /root/.ansible/tmp/ansible-tmp-1726867424.5053453-24234-141357066568955/AnsiballZ_dnf.py <<< 23826 1726867424.98435: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867424.5053453-24234-141357066568955/AnsiballZ_dnf.py" <<< 23826 1726867424.98662: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpwx1aovs_" to remote "/root/.ansible/tmp/ansible-tmp-1726867424.5053453-24234-141357066568955/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867424.5053453-24234-141357066568955/AnsiballZ_dnf.py" <<< 23826 1726867425.00245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867425.00453: stderr chunk (state=3): >>><<< 23826 1726867425.00456: stdout chunk (state=3): >>><<< 23826 1726867425.00458: done transferring module to remote 23826 1726867425.00464: _low_level_execute_command(): starting 23826 1726867425.00467: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867424.5053453-24234-141357066568955/ /root/.ansible/tmp/ansible-tmp-1726867424.5053453-24234-141357066568955/AnsiballZ_dnf.py && sleep 0' 23826 1726867425.01773: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867425.01874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867425.02000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867425.02026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867425.02102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867425.04032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867425.04045: stdout chunk (state=3): >>><<< 23826 1726867425.04058: stderr chunk (state=3): >>><<< 23826 1726867425.04080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867425.04227: _low_level_execute_command(): starting 23826 1726867425.04231: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867424.5053453-24234-141357066568955/AnsiballZ_dnf.py && sleep 0' 23826 1726867425.05591: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867425.05620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867425.05704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867425.46624: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 23826 1726867425.50922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 23826 1726867425.50942: stdout chunk (state=3): >>><<< 23826 1726867425.50954: stderr chunk (state=3): >>><<< 23826 1726867425.50981: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
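The module result above confirms the effective parameters of the "Install iproute" task: the 'package' action resolved to ansible.legacy.dnf on this host and ran with name=["iproute"], state=present and all other options at their defaults, reporting "Nothing to do" (changed=false, rc=0) because the package is already present. A sketch of a task consistent with this invocation follows; the register/until wrapper is inferred from the __install_status conditional and "attempts": 1 seen below, and the retries/delay values are placeholders, not taken from manage_test_interface.yml.

    # Sketch implied by the dnf module_args and the __install_status handling
    # in the trace; retries/delay are illustrative placeholders.
    - name: Install iproute
      package:
        name: iproute
        state: present
      register: __install_status
      until: __install_status is success
      retries: 3
      delay: 5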
23826 1726867425.51373: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867424.5053453-24234-141357066568955/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867425.51386: _low_level_execute_command(): starting 23826 1726867425.51389: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867424.5053453-24234-141357066568955/ > /dev/null 2>&1 && sleep 0' 23826 1726867425.52395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867425.52536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867425.52685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867425.52779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867425.54749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867425.54787: stderr chunk (state=3): >>><<< 23826 1726867425.54826: stdout chunk (state=3): >>><<< 23826 1726867425.54856: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867425.54864: handler run complete 23826 1726867425.55029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867425.55625: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867425.55665: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867425.55801: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867425.55831: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867425.56018: variable '__install_status' from source: unknown 23826 1726867425.56288: Evaluated conditional (__install_status is success): True 23826 1726867425.56291: attempt loop complete, returning result 23826 1726867425.56293: _execute() done 23826 1726867425.56295: dumping result to json 23826 1726867425.56298: done dumping result, returning 23826 1726867425.56300: done running TaskExecutor() for managed_node2/TASK: Install iproute [0affcac9-a3a5-a92d-a3ea-0000000001b1] 23826 1726867425.56302: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b1 23826 1726867425.56373: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b1 23826 1726867425.56379: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 23826 1726867425.56468: no more pending results, returning what we have 23826 1726867425.56471: results queue empty 23826 1726867425.56472: checking for any_errors_fatal 23826 1726867425.56476: done checking for any_errors_fatal 23826 1726867425.56479: checking for max_fail_percentage 23826 1726867425.56481: done checking for max_fail_percentage 23826 1726867425.56482: checking to see if all hosts have failed and the running result is not ok 23826 1726867425.56483: done checking to see if all hosts have failed 23826 1726867425.56484: getting the remaining hosts for this loop 23826 1726867425.56485: done getting the remaining hosts for this loop 23826 1726867425.56489: getting the next task for host managed_node2 23826 1726867425.56497: done getting next task for host managed_node2 23826 1726867425.56499: ^ task is: TASK: Create veth interface {{ interface }} 23826 1726867425.56502: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867425.56509: getting variables 23826 1726867425.56511: in VariableManager get_vars() 23826 1726867425.56548: Calling all_inventory to load vars for managed_node2 23826 1726867425.56550: Calling groups_inventory to load vars for managed_node2 23826 1726867425.56553: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867425.56564: Calling all_plugins_play to load vars for managed_node2 23826 1726867425.56567: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867425.56569: Calling groups_plugins_play to load vars for managed_node2 23826 1726867425.56845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867425.57237: done with get_vars() 23826 1726867425.57247: done getting variables 23826 1726867425.57443: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 23826 1726867425.57559: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 17:23:45 -0400 (0:00:01.161) 0:00:07.587 ****** 23826 1726867425.57601: entering _queue_task() for managed_node2/command 23826 1726867425.57871: worker is 1 (out of 1 available) 23826 1726867425.58083: exiting _queue_task() for managed_node2/command 23826 1726867425.58094: done queuing things up, now waiting for results queue to drain 23826 1726867425.58095: waiting for pending results... 
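The next task, "Create veth interface ethtest0", comes from manage_test_interface.yml:27 with the interface name templated from set_fact. The trace that follows shows an items lookup, a conditional gate (type == 'veth' and state == 'present' and interface not in current_interfaces), and a command module call running "ip link add ethtest0 type veth peer name peerethtest0". The sketch below reconstructs such a task; only the first loop item is actually visible in this excerpt, so the remaining items and the exact when-clause layout are assumptions.

    # Hedged reconstruction of a veth-creation task consistent with the trace;
    # loop items after the first "ip link add ..." command are assumed.
    - name: Create veth interface {{ interface }}
      command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}
        - ip link set peer{{ interface }} up
        - ip link set {{ interface }} up
      when:
        - type == 'veth'
        - state == 'present'
        - interface not in current_interfaces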
23826 1726867425.58158: running TaskExecutor() for managed_node2/TASK: Create veth interface ethtest0 23826 1726867425.58262: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000001b2 23826 1726867425.58294: variable 'ansible_search_path' from source: unknown 23826 1726867425.58321: variable 'ansible_search_path' from source: unknown 23826 1726867425.58576: variable 'interface' from source: set_fact 23826 1726867425.58666: variable 'interface' from source: set_fact 23826 1726867425.58758: variable 'interface' from source: set_fact 23826 1726867425.58895: Loaded config def from plugin (lookup/items) 23826 1726867425.58976: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 23826 1726867425.58981: variable 'omit' from source: magic vars 23826 1726867425.59056: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867425.59070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867425.59091: variable 'omit' from source: magic vars 23826 1726867425.59885: variable 'ansible_distribution_major_version' from source: facts 23826 1726867425.60425: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867425.60685: variable 'type' from source: set_fact 23826 1726867425.60694: variable 'state' from source: include params 23826 1726867425.60701: variable 'interface' from source: set_fact 23826 1726867425.60710: variable 'current_interfaces' from source: set_fact 23826 1726867425.60719: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 23826 1726867425.60728: variable 'omit' from source: magic vars 23826 1726867425.60768: variable 'omit' from source: magic vars 23826 1726867425.60816: variable 'item' from source: unknown 23826 1726867425.60891: variable 'item' from source: unknown 23826 1726867425.60914: variable 'omit' from source: magic vars 23826 1726867425.60946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867425.60989: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867425.61015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867425.61036: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867425.61051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867425.61083: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867425.61101: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867425.61213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867425.61216: Set connection var ansible_timeout to 10 23826 1726867425.61230: Set connection var ansible_shell_executable to /bin/sh 23826 1726867425.61236: Set connection var ansible_connection to ssh 23826 1726867425.61248: Set connection var ansible_pipelining to False 23826 1726867425.61257: Set connection var ansible_shell_type to sh 23826 1726867425.61269: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867425.61297: variable 'ansible_shell_executable' from source: unknown 23826 1726867425.61305: variable 'ansible_connection' from source: unknown 23826 1726867425.61322: variable 
'ansible_module_compression' from source: unknown 23826 1726867425.61329: variable 'ansible_shell_type' from source: unknown 23826 1726867425.61335: variable 'ansible_shell_executable' from source: unknown 23826 1726867425.61341: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867425.61348: variable 'ansible_pipelining' from source: unknown 23826 1726867425.61354: variable 'ansible_timeout' from source: unknown 23826 1726867425.61361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867425.61504: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867425.61526: variable 'omit' from source: magic vars 23826 1726867425.61546: starting attempt loop 23826 1726867425.61555: running the handler 23826 1726867425.61580: _low_level_execute_command(): starting 23826 1726867425.61594: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867425.62343: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867425.62529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867425.62643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867425.62740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867425.62764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867425.62851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867425.64606: stdout chunk (state=3): >>>/root <<< 23826 1726867425.64658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867425.64667: stdout chunk (state=3): >>><<< 23826 1726867425.64682: stderr chunk (state=3): >>><<< 23826 1726867425.64734: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867425.64874: _low_level_execute_command(): starting 23826 1726867425.64881: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867425.6478384-24275-96478891827119 `" && echo ansible-tmp-1726867425.6478384-24275-96478891827119="` echo /root/.ansible/tmp/ansible-tmp-1726867425.6478384-24275-96478891827119 `" ) && sleep 0' 23826 1726867425.66175: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867425.66192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867425.66215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867425.66295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867425.68375: stdout chunk (state=3): >>>ansible-tmp-1726867425.6478384-24275-96478891827119=/root/.ansible/tmp/ansible-tmp-1726867425.6478384-24275-96478891827119 <<< 23826 1726867425.68583: stdout chunk (state=3): >>><<< 23826 1726867425.68586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867425.68589: stderr chunk (state=3): >>><<< 23826 1726867425.68591: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867425.6478384-24275-96478891827119=/root/.ansible/tmp/ansible-tmp-1726867425.6478384-24275-96478891827119 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867425.68615: variable 'ansible_module_compression' from source: unknown 23826 1726867425.68751: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 23826 1726867425.68928: variable 'ansible_facts' from source: unknown 23826 1726867425.69042: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867425.6478384-24275-96478891827119/AnsiballZ_command.py 23826 1726867425.69419: Sending initial data 23826 1726867425.69422: Sent initial data (155 bytes) 23826 1726867425.70896: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867425.70899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867425.70901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867425.70929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867425.71003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867425.72618: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 23826 1726867425.72622: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867425.72810: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867425.72855: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpmmm0qopr /root/.ansible/tmp/ansible-tmp-1726867425.6478384-24275-96478891827119/AnsiballZ_command.py <<< 23826 1726867425.72859: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867425.6478384-24275-96478891827119/AnsiballZ_command.py" <<< 23826 1726867425.72884: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpmmm0qopr" to remote "/root/.ansible/tmp/ansible-tmp-1726867425.6478384-24275-96478891827119/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867425.6478384-24275-96478891827119/AnsiballZ_command.py" <<< 23826 1726867425.74352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867425.74418: stderr chunk (state=3): >>><<< 23826 1726867425.74421: stdout chunk (state=3): >>><<< 23826 1726867425.74443: done transferring module to remote 23826 1726867425.74480: _low_level_execute_command(): starting 23826 1726867425.74485: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867425.6478384-24275-96478891827119/ /root/.ansible/tmp/ansible-tmp-1726867425.6478384-24275-96478891827119/AnsiballZ_command.py && sleep 0' 23826 1726867425.75787: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867425.76082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867425.76091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867425.76093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867425.77874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867425.77995: stderr chunk (state=3): >>><<< 23826 1726867425.77999: stdout chunk (state=3): >>><<< 23826 1726867425.78016: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867425.78019: _low_level_execute_command(): starting 23826 1726867425.78024: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867425.6478384-24275-96478891827119/AnsiballZ_command.py && sleep 0' 23826 1726867425.79319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 23826 1726867425.79402: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 23826 1726867425.79411: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 23826 1726867425.79417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867425.79427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867425.79440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867425.79448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867425.79456: stderr chunk (state=3): >>>debug2: match found <<< 23826 1726867425.79463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867425.79534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867425.79649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867425.79694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867425.95815: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 17:23:45.949250", "end": "2024-09-20 17:23:45.955853", "delta": "0:00:00.006603", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 23826 1726867425.98343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 23826 1726867425.98347: stdout chunk (state=3): >>><<< 23826 1726867425.98352: stderr chunk (state=3): >>><<< 23826 1726867425.98381: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 17:23:45.949250", "end": "2024-09-20 17:23:45.955853", "delta": "0:00:00.006603", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
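The JSON result above records the first loop item: the command module created a veth pair named ethtest0/peerethtest0 on the managed node. As a hedged aside (not part of the recorded run), the result can be sanity-checked on the target with standard iproute2 commands; the device names are copied verbatim from the log, and the check itself is only an assumed convenience.

# Optional manual check on the managed node after the item above (assumed, not logged):
# each end of the pair should now be listed, with -d showing the veth link detail.
ip -d link show ethtest0
ip -d link show peerethtest0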
23826 1726867425.98524: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867425.6478384-24275-96478891827119/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867425.98532: _low_level_execute_command(): starting 23826 1726867425.98537: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867425.6478384-24275-96478891827119/ > /dev/null 2>&1 && sleep 0' 23826 1726867426.00053: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867426.00161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.00290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867426.00338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867426.00714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867426.04240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867426.04244: stdout chunk (state=3): >>><<< 23826 1726867426.04247: stderr chunk (state=3): >>><<< 23826 1726867426.04364: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867426.04368: handler run complete 23826 1726867426.04397: Evaluated conditional (False): False 23826 1726867426.04686: attempt loop complete, returning result 23826 1726867426.04690: variable 'item' from source: unknown 23826 1726867426.04711: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.006603", "end": "2024-09-20 17:23:45.955853", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-20 17:23:45.949250" } 23826 1726867426.05236: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867426.05239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867426.05242: variable 'omit' from source: magic vars 23826 1726867426.05671: variable 'ansible_distribution_major_version' from source: facts 23826 1726867426.05675: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867426.05873: variable 'type' from source: set_fact 23826 1726867426.06094: variable 'state' from source: include params 23826 1726867426.06098: variable 'interface' from source: set_fact 23826 1726867426.06100: variable 'current_interfaces' from source: set_fact 23826 1726867426.06102: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 23826 1726867426.06104: variable 'omit' from source: magic vars 23826 1726867426.06106: variable 'omit' from source: magic vars 23826 1726867426.06110: variable 'item' from source: unknown 23826 1726867426.06283: variable 'item' from source: unknown 23826 1726867426.06286: variable 'omit' from source: magic vars 23826 1726867426.06319: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867426.06343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867426.06515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867426.06518: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867426.06521: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867426.06527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867426.06749: Set connection var ansible_timeout to 10 23826 1726867426.06752: Set connection var ansible_shell_executable to /bin/sh 23826 1726867426.06754: Set connection var ansible_connection to ssh 23826 1726867426.06756: Set connection var ansible_pipelining to False 23826 1726867426.06759: Set connection var ansible_shell_type to sh 23826 1726867426.06762: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867426.06764: variable 'ansible_shell_executable' from source: unknown 
23826 1726867426.06766: variable 'ansible_connection' from source: unknown 23826 1726867426.06767: variable 'ansible_module_compression' from source: unknown 23826 1726867426.06769: variable 'ansible_shell_type' from source: unknown 23826 1726867426.06771: variable 'ansible_shell_executable' from source: unknown 23826 1726867426.06773: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867426.06774: variable 'ansible_pipelining' from source: unknown 23826 1726867426.06776: variable 'ansible_timeout' from source: unknown 23826 1726867426.06780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867426.07113: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867426.07116: variable 'omit' from source: magic vars 23826 1726867426.07119: starting attempt loop 23826 1726867426.07121: running the handler 23826 1726867426.07123: _low_level_execute_command(): starting 23826 1726867426.07126: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867426.08448: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867426.08541: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.08719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867426.08792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867426.08954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867426.10710: stdout chunk (state=3): >>>/root <<< 23826 1726867426.10761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867426.10775: stdout chunk (state=3): >>><<< 23826 1726867426.10789: stderr chunk (state=3): >>><<< 23826 1726867426.10834: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867426.10925: _low_level_execute_command(): starting 23826 1726867426.10930: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867426.1084461-24275-114479587613889 `" && echo ansible-tmp-1726867426.1084461-24275-114479587613889="` echo /root/.ansible/tmp/ansible-tmp-1726867426.1084461-24275-114479587613889 `" ) && sleep 0' 23826 1726867426.12212: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867426.12215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867426.12217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867426.12230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.12292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867426.12378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867426.12402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867426.12481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867426.14445: stdout chunk (state=3): >>>ansible-tmp-1726867426.1084461-24275-114479587613889=/root/.ansible/tmp/ansible-tmp-1726867426.1084461-24275-114479587613889 <<< 23826 1726867426.14557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867426.14666: stderr chunk (state=3): >>><<< 23826 1726867426.14674: stdout chunk (state=3): >>><<< 23826 1726867426.14806: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867426.1084461-24275-114479587613889=/root/.ansible/tmp/ansible-tmp-1726867426.1084461-24275-114479587613889 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867426.14809: variable 'ansible_module_compression' from source: unknown 23826 1726867426.14968: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 23826 1726867426.14972: variable 'ansible_facts' from source: unknown 23826 1726867426.15021: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867426.1084461-24275-114479587613889/AnsiballZ_command.py 23826 1726867426.15306: Sending initial data 23826 1726867426.15310: Sent initial data (156 bytes) 23826 1726867426.16582: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.16629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867426.16649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867426.16681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867426.16809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867426.18570: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: 
Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867426.18607: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 23826 1726867426.18642: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmp7ynord14 /root/.ansible/tmp/ansible-tmp-1726867426.1084461-24275-114479587613889/AnsiballZ_command.py <<< 23826 1726867426.18650: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867426.1084461-24275-114479587613889/AnsiballZ_command.py" <<< 23826 1726867426.18788: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmp7ynord14" to remote "/root/.ansible/tmp/ansible-tmp-1726867426.1084461-24275-114479587613889/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867426.1084461-24275-114479587613889/AnsiballZ_command.py" <<< 23826 1726867426.20069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867426.20219: stderr chunk (state=3): >>><<< 23826 1726867426.20222: stdout chunk (state=3): >>><<< 23826 1726867426.20250: done transferring module to remote 23826 1726867426.20258: _low_level_execute_command(): starting 23826 1726867426.20269: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867426.1084461-24275-114479587613889/ /root/.ansible/tmp/ansible-tmp-1726867426.1084461-24275-114479587613889/AnsiballZ_command.py && sleep 0' 23826 1726867426.21483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867426.21490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.21595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867426.21613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867426.21718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867426.21757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867426.23794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867426.23797: stdout chunk (state=3): >>><<< 23826 1726867426.23799: stderr chunk (state=3): >>><<< 23826 1726867426.23802: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867426.23804: _low_level_execute_command(): starting 23826 1726867426.23972: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867426.1084461-24275-114479587613889/AnsiballZ_command.py && sleep 0' 23826 1726867426.25286: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867426.25385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.25513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867426.25551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867426.25602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867426.41482: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 17:23:46.409476", "end": "2024-09-20 17:23:46.413403", "delta": "0:00:00.003927", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 23826 1726867426.43132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 23826 1726867426.43171: stdout chunk (state=3): >>><<< 23826 1726867426.43187: stderr chunk (state=3): >>><<< 23826 1726867426.43212: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 17:23:46.409476", "end": "2024-09-20 17:23:46.413403", "delta": "0:00:00.003927", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
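At this point the run has executed two of the loop items; the third ('ip link set ethtest0 up') appears further down in this log. As a minimal consolidated sketch for readability only, the per-item commands recorded in the module results amount to the following veth bring-up sequence; every command below is copied from the logged module arguments, not added.

# The three loop items recorded in this run, each executed remotely via ansible.legacy.command:
ip link add ethtest0 type veth peer name peerethtest0   # item 1: create the veth pair
ip link set peerethtest0 up                             # item 2: bring the peer end up
ip link set ethtest0 up                                 # item 3: bring the test interface up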
23826 1726867426.43370: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867426.1084461-24275-114479587613889/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867426.43374: _low_level_execute_command(): starting 23826 1726867426.43376: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867426.1084461-24275-114479587613889/ > /dev/null 2>&1 && sleep 0' 23826 1726867426.43896: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867426.43903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867426.43916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867426.43969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867426.43972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867426.43975: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867426.43979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.43981: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867426.43984: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 23826 1726867426.44021: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 23826 1726867426.44087: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867426.44090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867426.44092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867426.44094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867426.44096: stderr chunk (state=3): >>>debug2: match found <<< 23826 1726867426.44098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.44130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867426.44153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867426.44156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867426.44337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867426.46215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867426.46219: stdout chunk (state=3): >>><<< 23826 1726867426.46252: stderr chunk (state=3): >>><<< 23826 1726867426.46426: _low_level_execute_command() done: 
rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867426.46431: handler run complete 23826 1726867426.46433: Evaluated conditional (False): False 23826 1726867426.46438: attempt loop complete, returning result 23826 1726867426.46481: variable 'item' from source: unknown 23826 1726867426.46563: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.003927", "end": "2024-09-20 17:23:46.413403", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-20 17:23:46.409476" } 23826 1726867426.46765: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867426.46768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867426.46771: variable 'omit' from source: magic vars 23826 1726867426.47291: variable 'ansible_distribution_major_version' from source: facts 23826 1726867426.47295: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867426.47550: variable 'type' from source: set_fact 23826 1726867426.47553: variable 'state' from source: include params 23826 1726867426.47556: variable 'interface' from source: set_fact 23826 1726867426.47558: variable 'current_interfaces' from source: set_fact 23826 1726867426.47561: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 23826 1726867426.47582: variable 'omit' from source: magic vars 23826 1726867426.47592: variable 'omit' from source: magic vars 23826 1726867426.47631: variable 'item' from source: unknown 23826 1726867426.47933: variable 'item' from source: unknown 23826 1726867426.47948: variable 'omit' from source: magic vars 23826 1726867426.47986: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867426.47990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867426.48193: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867426.48246: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
23826 1726867426.48253: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867426.48261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867426.48333: Set connection var ansible_timeout to 10 23826 1726867426.48336: Set connection var ansible_shell_executable to /bin/sh 23826 1726867426.48338: Set connection var ansible_connection to ssh 23826 1726867426.48461: Set connection var ansible_pipelining to False 23826 1726867426.48467: Set connection var ansible_shell_type to sh 23826 1726867426.48470: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867426.48472: variable 'ansible_shell_executable' from source: unknown 23826 1726867426.48475: variable 'ansible_connection' from source: unknown 23826 1726867426.48478: variable 'ansible_module_compression' from source: unknown 23826 1726867426.48480: variable 'ansible_shell_type' from source: unknown 23826 1726867426.48482: variable 'ansible_shell_executable' from source: unknown 23826 1726867426.48484: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867426.48486: variable 'ansible_pipelining' from source: unknown 23826 1726867426.48488: variable 'ansible_timeout' from source: unknown 23826 1726867426.48490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867426.48720: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867426.48723: variable 'omit' from source: magic vars 23826 1726867426.48725: starting attempt loop 23826 1726867426.48727: running the handler 23826 1726867426.48729: _low_level_execute_command(): starting 23826 1726867426.48731: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867426.49893: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867426.49906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867426.49984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867426.49988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867426.49991: stderr chunk (state=3): >>>debug2: match found <<< 23826 1726867426.49993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.50184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867426.50188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867426.50222: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867426.51904: stdout chunk (state=3): >>>/root <<< 23826 1726867426.52005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867426.52092: stderr chunk (state=3): >>><<< 23826 1726867426.52095: stdout chunk (state=3): >>><<< 23826 1726867426.52183: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867426.52279: _low_level_execute_command(): starting 23826 1726867426.52286: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867426.5218165-24275-156837610875873 `" && echo ansible-tmp-1726867426.5218165-24275-156837610875873="` echo /root/.ansible/tmp/ansible-tmp-1726867426.5218165-24275-156837610875873 `" ) && sleep 0' 23826 1726867426.53398: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867426.53434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867426.53444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867426.53457: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867426.53592: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.53659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867426.53676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867426.53762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
23826 1726867426.53844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867426.55843: stdout chunk (state=3): >>>ansible-tmp-1726867426.5218165-24275-156837610875873=/root/.ansible/tmp/ansible-tmp-1726867426.5218165-24275-156837610875873 <<< 23826 1726867426.56183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867426.56186: stdout chunk (state=3): >>><<< 23826 1726867426.56252: stderr chunk (state=3): >>><<< 23826 1726867426.56255: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867426.5218165-24275-156837610875873=/root/.ansible/tmp/ansible-tmp-1726867426.5218165-24275-156837610875873 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867426.56257: variable 'ansible_module_compression' from source: unknown 23826 1726867426.56269: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 23826 1726867426.56295: variable 'ansible_facts' from source: unknown 23826 1726867426.56366: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867426.5218165-24275-156837610875873/AnsiballZ_command.py 23826 1726867426.56826: Sending initial data 23826 1726867426.56829: Sent initial data (156 bytes) 23826 1726867426.58476: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867426.58583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867426.58587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867426.58589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.58595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867426.58598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 
originally 10.31.12.116 debug2: match found <<< 23826 1726867426.58600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.58797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867426.58833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867426.58848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867426.59026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867426.60648: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867426.60890: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 23826 1726867426.60893: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867426.5218165-24275-156837610875873/AnsiballZ_command.py" <<< 23826 1726867426.60896: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpav9etc1n /root/.ansible/tmp/ansible-tmp-1726867426.5218165-24275-156837610875873/AnsiballZ_command.py <<< 23826 1726867426.60898: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpav9etc1n" to remote "/root/.ansible/tmp/ansible-tmp-1726867426.5218165-24275-156837610875873/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867426.5218165-24275-156837610875873/AnsiballZ_command.py" <<< 23826 1726867426.62612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867426.62628: stderr chunk (state=3): >>><<< 23826 1726867426.62632: stdout chunk (state=3): >>><<< 23826 1726867426.62674: done transferring module to remote 23826 1726867426.62685: _low_level_execute_command(): starting 23826 1726867426.62690: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867426.5218165-24275-156837610875873/ /root/.ansible/tmp/ansible-tmp-1726867426.5218165-24275-156837610875873/AnsiballZ_command.py && sleep 0' 23826 1726867426.64222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867426.64302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867426.64305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867426.64399: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867426.64413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.64650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867426.64653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867426.64727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867426.66773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867426.66776: stdout chunk (state=3): >>><<< 23826 1726867426.66780: stderr chunk (state=3): >>><<< 23826 1726867426.66858: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867426.66861: _low_level_execute_command(): starting 23826 1726867426.66864: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867426.5218165-24275-156837610875873/AnsiballZ_command.py && sleep 0' 23826 1726867426.67674: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867426.67688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867426.67702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867426.67712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867426.67725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867426.67732: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867426.67742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.67790: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.67830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867426.67880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867426.67886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867426.67931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867426.83696: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 17:23:46.831021", "end": "2024-09-20 17:23:46.834891", "delta": "0:00:00.003870", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 23826 1726867426.83701: stdout chunk (state=3): >>> <<< 23826 1726867426.85240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 23826 1726867426.85244: stdout chunk (state=3): >>><<< 23826 1726867426.85246: stderr chunk (state=3): >>><<< 23826 1726867426.85265: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 17:23:46.831021", "end": "2024-09-20 17:23:46.834891", "delta": "0:00:00.003870", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.12.116 closed. 23826 1726867426.85292: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867426.5218165-24275-156837610875873/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867426.85295: _low_level_execute_command(): starting 23826 1726867426.85301: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867426.5218165-24275-156837610875873/ > /dev/null 2>&1 && sleep 0' 23826 1726867426.85915: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867426.85933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867426.87737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867426.87757: stderr chunk (state=3): >>><<< 23826 1726867426.87760: stdout chunk (state=3): >>><<< 23826 1726867426.87771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867426.87776: handler run complete 23826 1726867426.87792: Evaluated conditional (False): False 23826 1726867426.87799: attempt loop complete, returning result 23826 1726867426.87814: variable 'item' from source: unknown 23826 1726867426.87875: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.003870", "end": "2024-09-20 17:23:46.834891", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-20 17:23:46.831021" } 23826 1726867426.87990: dumping result to json 23826 1726867426.87993: done dumping result, returning 23826 1726867426.87995: done running TaskExecutor() for managed_node2/TASK: Create veth interface ethtest0 [0affcac9-a3a5-a92d-a3ea-0000000001b2] 23826 1726867426.87996: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b2 23826 1726867426.88167: no more pending results, returning what we have 23826 1726867426.88170: results queue empty 23826 1726867426.88171: checking for any_errors_fatal 23826 1726867426.88174: done checking for any_errors_fatal 23826 1726867426.88175: checking for max_fail_percentage 23826 1726867426.88183: done checking for max_fail_percentage 23826 1726867426.88184: checking to see if all hosts have failed and the running result is not ok 23826 1726867426.88185: done checking to see if all hosts have failed 23826 1726867426.88187: getting the remaining hosts for this loop 23826 1726867426.88195: done getting the remaining hosts for this loop 23826 1726867426.88198: getting the next task for host managed_node2 23826 1726867426.88202: done getting next task for host managed_node2 23826 1726867426.88204: ^ task is: TASK: Set up veth as managed by NetworkManager 23826 1726867426.88209: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867426.88213: getting variables 23826 1726867426.88214: in VariableManager get_vars() 23826 1726867426.88239: Calling all_inventory to load vars for managed_node2 23826 1726867426.88242: Calling groups_inventory to load vars for managed_node2 23826 1726867426.88244: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867426.88252: Calling all_plugins_play to load vars for managed_node2 23826 1726867426.88254: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867426.88257: Calling groups_plugins_play to load vars for managed_node2 23826 1726867426.88361: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b2 23826 1726867426.88365: WORKER PROCESS EXITING 23826 1726867426.88375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867426.88493: done with get_vars() 23826 1726867426.88500: done getting variables 23826 1726867426.88544: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 17:23:46 -0400 (0:00:01.309) 0:00:08.897 ****** 23826 1726867426.88565: entering _queue_task() for managed_node2/command 23826 1726867426.88753: worker is 1 (out of 1 available) 23826 1726867426.88764: exiting _queue_task() for managed_node2/command 23826 1726867426.88774: done queuing things up, now waiting for results queue to drain 23826 1726867426.88776: waiting for pending results... 
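The task queued above comes from manage_test_interface.yml:35 and, as the trace that follows shows, runs a single nmcli command once the guard (type == 'veth' and state == 'present') evaluates True. A minimal sketch of what such a task presumably looks like (an assumption; the tasks file itself is not reproduced in this log):

    - name: Set up veth as managed by NetworkManager
      command: nmcli d set {{ interface }} managed true
      when:
        - type == 'veth'
        - state == 'present'

For this run 'interface' resolves to ethtest0 via set_fact, so the command executed on managed_node2 is nmcli d set ethtest0 managed true.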
23826 1726867426.88931: running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager 23826 1726867426.88987: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000001b3 23826 1726867426.88999: variable 'ansible_search_path' from source: unknown 23826 1726867426.89004: variable 'ansible_search_path' from source: unknown 23826 1726867426.89034: calling self._execute() 23826 1726867426.89095: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867426.89099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867426.89107: variable 'omit' from source: magic vars 23826 1726867426.89368: variable 'ansible_distribution_major_version' from source: facts 23826 1726867426.89378: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867426.89482: variable 'type' from source: set_fact 23826 1726867426.89488: variable 'state' from source: include params 23826 1726867426.89493: Evaluated conditional (type == 'veth' and state == 'present'): True 23826 1726867426.89499: variable 'omit' from source: magic vars 23826 1726867426.89526: variable 'omit' from source: magic vars 23826 1726867426.89598: variable 'interface' from source: set_fact 23826 1726867426.89613: variable 'omit' from source: magic vars 23826 1726867426.89644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867426.89672: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867426.89690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867426.89703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867426.89715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867426.89744: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867426.89746: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867426.89749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867426.89822: Set connection var ansible_timeout to 10 23826 1726867426.89829: Set connection var ansible_shell_executable to /bin/sh 23826 1726867426.89831: Set connection var ansible_connection to ssh 23826 1726867426.89837: Set connection var ansible_pipelining to False 23826 1726867426.89840: Set connection var ansible_shell_type to sh 23826 1726867426.89845: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867426.89862: variable 'ansible_shell_executable' from source: unknown 23826 1726867426.89865: variable 'ansible_connection' from source: unknown 23826 1726867426.89867: variable 'ansible_module_compression' from source: unknown 23826 1726867426.89869: variable 'ansible_shell_type' from source: unknown 23826 1726867426.89871: variable 'ansible_shell_executable' from source: unknown 23826 1726867426.89873: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867426.89882: variable 'ansible_pipelining' from source: unknown 23826 1726867426.89886: variable 'ansible_timeout' from source: unknown 23826 1726867426.89888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867426.89987: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867426.90007: variable 'omit' from source: magic vars 23826 1726867426.90010: starting attempt loop 23826 1726867426.90013: running the handler 23826 1726867426.90019: _low_level_execute_command(): starting 23826 1726867426.90027: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867426.90503: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867426.90510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.90514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 23826 1726867426.90517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.90560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867426.90610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867426.92233: stdout chunk (state=3): >>>/root <<< 23826 1726867426.92331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867426.92353: stderr chunk (state=3): >>><<< 23826 1726867426.92356: stdout chunk (state=3): >>><<< 23826 1726867426.92382: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 23826 1726867426.92392: _low_level_execute_command(): starting 23826 1726867426.92396: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867426.923801-24342-179089596814632 `" && echo ansible-tmp-1726867426.923801-24342-179089596814632="` echo /root/.ansible/tmp/ansible-tmp-1726867426.923801-24342-179089596814632 `" ) && sleep 0' 23826 1726867426.92802: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867426.92806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.92814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 23826 1726867426.92817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 23826 1726867426.92819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.92862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867426.92865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867426.92910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867426.94804: stdout chunk (state=3): >>>ansible-tmp-1726867426.923801-24342-179089596814632=/root/.ansible/tmp/ansible-tmp-1726867426.923801-24342-179089596814632 <<< 23826 1726867426.94916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867426.94937: stderr chunk (state=3): >>><<< 23826 1726867426.94940: stdout chunk (state=3): >>><<< 23826 1726867426.94951: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867426.923801-24342-179089596814632=/root/.ansible/tmp/ansible-tmp-1726867426.923801-24342-179089596814632 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867426.94981: variable 'ansible_module_compression' from source: unknown 23826 1726867426.95018: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 23826 1726867426.95045: variable 'ansible_facts' from source: unknown 23826 1726867426.95103: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867426.923801-24342-179089596814632/AnsiballZ_command.py 23826 1726867426.95194: Sending initial data 23826 1726867426.95198: Sent initial data (155 bytes) 23826 1726867426.95614: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867426.95617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867426.95619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 23826 1726867426.95621: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867426.95623: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.95674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867426.95679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867426.95723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867426.97285: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 23826 1726867426.97288: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867426.97320: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867426.97358: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpnh0r6u2z /root/.ansible/tmp/ansible-tmp-1726867426.923801-24342-179089596814632/AnsiballZ_command.py <<< 23826 1726867426.97368: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867426.923801-24342-179089596814632/AnsiballZ_command.py" <<< 23826 1726867426.97395: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpnh0r6u2z" to remote "/root/.ansible/tmp/ansible-tmp-1726867426.923801-24342-179089596814632/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867426.923801-24342-179089596814632/AnsiballZ_command.py" <<< 23826 1726867426.97922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867426.97952: stderr chunk (state=3): >>><<< 23826 1726867426.97955: stdout chunk (state=3): >>><<< 23826 1726867426.97974: done transferring module to remote 23826 1726867426.97983: _low_level_execute_command(): starting 23826 1726867426.97986: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867426.923801-24342-179089596814632/ /root/.ansible/tmp/ansible-tmp-1726867426.923801-24342-179089596814632/AnsiballZ_command.py && sleep 0' 23826 1726867426.98385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867426.98389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.98391: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867426.98396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 23826 1726867426.98398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867426.98441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867426.98444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867426.98489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867427.00248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867427.00268: stderr chunk (state=3): >>><<< 23826 1726867427.00273: stdout chunk (state=3): >>><<< 23826 1726867427.00287: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 
originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867427.00290: _low_level_execute_command(): starting 23826 1726867427.00293: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867426.923801-24342-179089596814632/AnsiballZ_command.py && sleep 0' 23826 1726867427.00686: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867427.00690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867427.00693: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867427.00704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867427.00741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867427.00757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867427.00805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867427.18413: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 17:23:47.163277", "end": "2024-09-20 17:23:47.181615", "delta": "0:00:00.018338", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 23826 1726867427.20449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 23826 1726867427.20454: stdout chunk (state=3): >>><<< 23826 1726867427.20457: stderr chunk (state=3): >>><<< 23826 1726867427.20459: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 17:23:47.163277", "end": "2024-09-20 17:23:47.181615", "delta": "0:00:00.018338", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
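The transfer-and-execute round trip visible above for this module (remote tmp directory, SFTP upload of AnsiballZ_command.py, chmod, execution with /usr/bin/python3.12, and the rm -f -r cleanup that follows) occurs largely because ansible_pipelining is False for this connection, as the 'Set connection var ansible_pipelining to False' lines in this trace show. A minimal sketch of how pipelining could be enabled to skip the file transfer, assuming the managed host permits it (illustrative only; not part of the recorded run):

    # group_vars/all.yml -- hypothetical, not present in this test setup
    ansible_pipelining: true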
23826 1726867427.20462: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867426.923801-24342-179089596814632/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867427.20465: _low_level_execute_command(): starting 23826 1726867427.20467: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867426.923801-24342-179089596814632/ > /dev/null 2>&1 && sleep 0' 23826 1726867427.21694: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867427.21849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867427.21853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867427.22211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867427.22294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867427.24195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867427.24204: stdout chunk (state=3): >>><<< 23826 1726867427.24217: stderr chunk (state=3): >>><<< 23826 1726867427.24237: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 
10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867427.24248: handler run complete 23826 1726867427.24484: Evaluated conditional (False): False 23826 1726867427.24487: attempt loop complete, returning result 23826 1726867427.24490: _execute() done 23826 1726867427.24492: dumping result to json 23826 1726867427.24494: done dumping result, returning 23826 1726867427.24496: done running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager [0affcac9-a3a5-a92d-a3ea-0000000001b3] 23826 1726867427.24498: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b3 23826 1726867427.24572: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b3 23826 1726867427.24575: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.018338", "end": "2024-09-20 17:23:47.181615", "rc": 0, "start": "2024-09-20 17:23:47.163277" } 23826 1726867427.24657: no more pending results, returning what we have 23826 1726867427.24661: results queue empty 23826 1726867427.24662: checking for any_errors_fatal 23826 1726867427.24801: done checking for any_errors_fatal 23826 1726867427.24802: checking for max_fail_percentage 23826 1726867427.24805: done checking for max_fail_percentage 23826 1726867427.24806: checking to see if all hosts have failed and the running result is not ok 23826 1726867427.24809: done checking to see if all hosts have failed 23826 1726867427.24810: getting the remaining hosts for this loop 23826 1726867427.24812: done getting the remaining hosts for this loop 23826 1726867427.24816: getting the next task for host managed_node2 23826 1726867427.24823: done getting next task for host managed_node2 23826 1726867427.24826: ^ task is: TASK: Delete veth interface {{ interface }} 23826 1726867427.24829: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867427.24834: getting variables 23826 1726867427.24836: in VariableManager get_vars() 23826 1726867427.24874: Calling all_inventory to load vars for managed_node2 23826 1726867427.25023: Calling groups_inventory to load vars for managed_node2 23826 1726867427.25027: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867427.25038: Calling all_plugins_play to load vars for managed_node2 23826 1726867427.25042: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867427.25045: Calling groups_plugins_play to load vars for managed_node2 23826 1726867427.25626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867427.26180: done with get_vars() 23826 1726867427.26190: done getting variables 23826 1726867427.26373: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 23826 1726867427.26689: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 17:23:47 -0400 (0:00:00.381) 0:00:09.278 ****** 23826 1726867427.26723: entering _queue_task() for managed_node2/command 23826 1726867427.28029: worker is 1 (out of 1 available) 23826 1726867427.28039: exiting _queue_task() for managed_node2/command 23826 1726867427.28051: done queuing things up, now waiting for results queue to drain 23826 1726867427.28054: waiting for pending results... 
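The 'Delete veth interface ethtest0' task queued above sits at manage_test_interface.yml:43; as its skip result below records, it is guarded by type == 'veth' and state == 'absent' and interface in current_interfaces, which is False here because this run has state == 'present'. A hedged sketch of the shape of such a task (the deletion command is an assumption, since the task never executes in this log):

    - name: Delete veth interface {{ interface }}
      command: ip link del {{ interface }}   # hypothetical command; not shown in the recorded run
      when:
        - type == 'veth'
        - state == 'absent'
        - interface in current_interfaces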
23826 1726867427.28723: running TaskExecutor() for managed_node2/TASK: Delete veth interface ethtest0 23826 1726867427.28728: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000001b4 23826 1726867427.28731: variable 'ansible_search_path' from source: unknown 23826 1726867427.28733: variable 'ansible_search_path' from source: unknown 23826 1726867427.28735: calling self._execute() 23826 1726867427.29083: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867427.29086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867427.29089: variable 'omit' from source: magic vars 23826 1726867427.30284: variable 'ansible_distribution_major_version' from source: facts 23826 1726867427.30288: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867427.31182: variable 'type' from source: set_fact 23826 1726867427.31185: variable 'state' from source: include params 23826 1726867427.31188: variable 'interface' from source: set_fact 23826 1726867427.31190: variable 'current_interfaces' from source: set_fact 23826 1726867427.31192: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 23826 1726867427.31194: when evaluation is False, skipping this task 23826 1726867427.31196: _execute() done 23826 1726867427.31198: dumping result to json 23826 1726867427.31201: done dumping result, returning 23826 1726867427.31203: done running TaskExecutor() for managed_node2/TASK: Delete veth interface ethtest0 [0affcac9-a3a5-a92d-a3ea-0000000001b4] 23826 1726867427.31204: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b4 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 23826 1726867427.31431: no more pending results, returning what we have 23826 1726867427.31435: results queue empty 23826 1726867427.31436: checking for any_errors_fatal 23826 1726867427.31443: done checking for any_errors_fatal 23826 1726867427.31444: checking for max_fail_percentage 23826 1726867427.31446: done checking for max_fail_percentage 23826 1726867427.31446: checking to see if all hosts have failed and the running result is not ok 23826 1726867427.31448: done checking to see if all hosts have failed 23826 1726867427.31448: getting the remaining hosts for this loop 23826 1726867427.31450: done getting the remaining hosts for this loop 23826 1726867427.31453: getting the next task for host managed_node2 23826 1726867427.31460: done getting next task for host managed_node2 23826 1726867427.31462: ^ task is: TASK: Create dummy interface {{ interface }} 23826 1726867427.31466: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867427.31470: getting variables 23826 1726867427.31471: in VariableManager get_vars() 23826 1726867427.31515: Calling all_inventory to load vars for managed_node2 23826 1726867427.31518: Calling groups_inventory to load vars for managed_node2 23826 1726867427.31520: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867427.31532: Calling all_plugins_play to load vars for managed_node2 23826 1726867427.31535: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867427.31538: Calling groups_plugins_play to load vars for managed_node2 23826 1726867427.32008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867427.32207: done with get_vars() 23826 1726867427.32217: done getting variables 23826 1726867427.32271: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 23826 1726867427.32381: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 17:23:47 -0400 (0:00:00.056) 0:00:09.335 ****** 23826 1726867427.32412: entering _queue_task() for managed_node2/command 23826 1726867427.32430: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b4 23826 1726867427.32433: WORKER PROCESS EXITING 23826 1726867427.32827: worker is 1 (out of 1 available) 23826 1726867427.32839: exiting _queue_task() for managed_node2/command 23826 1726867427.32850: done queuing things up, now waiting for results queue to drain 23826 1726867427.32851: waiting for pending results... 
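The dummy-interface tasks follow the same pattern. 'Create dummy interface ethtest0' (manage_test_interface.yml:49) is guarded by type == 'dummy' and state == 'present' and interface not in current_interfaces, and its delete counterpart at :54 carries the mirrored 'absent' guard, as the two skip results below show. A sketch under those assumptions; the actual command is not visible here because the task is skipped:

    - name: Create dummy interface {{ interface }}
      command: ip link add {{ interface }} type dummy   # hypothetical; the skipped task's command is not logged
      when:
        - type == 'dummy'
        - state == 'present'
        - interface not in current_interfaces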
23826 1726867427.33230: running TaskExecutor() for managed_node2/TASK: Create dummy interface ethtest0 23826 1726867427.33332: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000001b5 23826 1726867427.33354: variable 'ansible_search_path' from source: unknown 23826 1726867427.33361: variable 'ansible_search_path' from source: unknown 23826 1726867427.33406: calling self._execute() 23826 1726867427.33500: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867427.33514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867427.33527: variable 'omit' from source: magic vars 23826 1726867427.34291: variable 'ansible_distribution_major_version' from source: facts 23826 1726867427.34415: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867427.34884: variable 'type' from source: set_fact 23826 1726867427.34887: variable 'state' from source: include params 23826 1726867427.34890: variable 'interface' from source: set_fact 23826 1726867427.34892: variable 'current_interfaces' from source: set_fact 23826 1726867427.34894: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 23826 1726867427.34896: when evaluation is False, skipping this task 23826 1726867427.34898: _execute() done 23826 1726867427.34900: dumping result to json 23826 1726867427.34902: done dumping result, returning 23826 1726867427.34904: done running TaskExecutor() for managed_node2/TASK: Create dummy interface ethtest0 [0affcac9-a3a5-a92d-a3ea-0000000001b5] 23826 1726867427.34906: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b5 23826 1726867427.34964: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b5 23826 1726867427.34966: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 23826 1726867427.35038: no more pending results, returning what we have 23826 1726867427.35041: results queue empty 23826 1726867427.35042: checking for any_errors_fatal 23826 1726867427.35047: done checking for any_errors_fatal 23826 1726867427.35048: checking for max_fail_percentage 23826 1726867427.35049: done checking for max_fail_percentage 23826 1726867427.35050: checking to see if all hosts have failed and the running result is not ok 23826 1726867427.35051: done checking to see if all hosts have failed 23826 1726867427.35051: getting the remaining hosts for this loop 23826 1726867427.35053: done getting the remaining hosts for this loop 23826 1726867427.35056: getting the next task for host managed_node2 23826 1726867427.35061: done getting next task for host managed_node2 23826 1726867427.35063: ^ task is: TASK: Delete dummy interface {{ interface }} 23826 1726867427.35066: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867427.35074: getting variables 23826 1726867427.35075: in VariableManager get_vars() 23826 1726867427.35107: Calling all_inventory to load vars for managed_node2 23826 1726867427.35110: Calling groups_inventory to load vars for managed_node2 23826 1726867427.35112: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867427.35125: Calling all_plugins_play to load vars for managed_node2 23826 1726867427.35128: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867427.35132: Calling groups_plugins_play to load vars for managed_node2 23826 1726867427.35726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867427.36420: done with get_vars() 23826 1726867427.36430: done getting variables 23826 1726867427.36502: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 23826 1726867427.36807: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 17:23:47 -0400 (0:00:00.044) 0:00:09.379 ****** 23826 1726867427.36835: entering _queue_task() for managed_node2/command 23826 1726867427.37376: worker is 1 (out of 1 available) 23826 1726867427.37390: exiting _queue_task() for managed_node2/command 23826 1726867427.37403: done queuing things up, now waiting for results queue to drain 23826 1726867427.37404: waiting for pending results... 
23826 1726867427.38022: running TaskExecutor() for managed_node2/TASK: Delete dummy interface ethtest0 23826 1726867427.38254: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000001b6 23826 1726867427.38279: variable 'ansible_search_path' from source: unknown 23826 1726867427.38286: variable 'ansible_search_path' from source: unknown 23826 1726867427.38331: calling self._execute() 23826 1726867427.38453: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867427.38638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867427.38700: variable 'omit' from source: magic vars 23826 1726867427.39984: variable 'ansible_distribution_major_version' from source: facts 23826 1726867427.39988: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867427.40390: variable 'type' from source: set_fact 23826 1726867427.40401: variable 'state' from source: include params 23826 1726867427.40433: variable 'interface' from source: set_fact 23826 1726867427.40481: variable 'current_interfaces' from source: set_fact 23826 1726867427.40529: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 23826 1726867427.40538: when evaluation is False, skipping this task 23826 1726867427.40546: _execute() done 23826 1726867427.40553: dumping result to json 23826 1726867427.40561: done dumping result, returning 23826 1726867427.40570: done running TaskExecutor() for managed_node2/TASK: Delete dummy interface ethtest0 [0affcac9-a3a5-a92d-a3ea-0000000001b6] 23826 1726867427.40582: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b6 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 23826 1726867427.40718: no more pending results, returning what we have 23826 1726867427.40722: results queue empty 23826 1726867427.40723: checking for any_errors_fatal 23826 1726867427.40731: done checking for any_errors_fatal 23826 1726867427.40732: checking for max_fail_percentage 23826 1726867427.40734: done checking for max_fail_percentage 23826 1726867427.40735: checking to see if all hosts have failed and the running result is not ok 23826 1726867427.40736: done checking to see if all hosts have failed 23826 1726867427.40736: getting the remaining hosts for this loop 23826 1726867427.40738: done getting the remaining hosts for this loop 23826 1726867427.40742: getting the next task for host managed_node2 23826 1726867427.40749: done getting next task for host managed_node2 23826 1726867427.40751: ^ task is: TASK: Create tap interface {{ interface }} 23826 1726867427.40754: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867427.40758: getting variables 23826 1726867427.40759: in VariableManager get_vars() 23826 1726867427.40799: Calling all_inventory to load vars for managed_node2 23826 1726867427.40802: Calling groups_inventory to load vars for managed_node2 23826 1726867427.40804: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867427.40817: Calling all_plugins_play to load vars for managed_node2 23826 1726867427.40821: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867427.40824: Calling groups_plugins_play to load vars for managed_node2 23826 1726867427.41204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867427.41717: done with get_vars() 23826 1726867427.41727: done getting variables 23826 1726867427.41789: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 23826 1726867427.41892: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 17:23:47 -0400 (0:00:00.050) 0:00:09.430 ****** 23826 1726867427.41922: entering _queue_task() for managed_node2/command 23826 1726867427.42188: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b6 23826 1726867427.42191: WORKER PROCESS EXITING 23826 1726867427.42353: worker is 1 (out of 1 available) 23826 1726867427.42365: exiting _queue_task() for managed_node2/command 23826 1726867427.42780: done queuing things up, now waiting for results queue to drain 23826 1726867427.42782: waiting for pending results... 
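'Create tap interface ethtest0' (manage_test_interface.yml:60) is queued above and is skipped under type == 'tap' and state == 'present' and interface not in current_interfaces, as its result below shows; the delete-tap task at :65 is queued immediately after it. Another hedged sketch, with the command itself again being an assumption rather than something this log confirms:

    - name: Create tap interface {{ interface }}
      command: ip tuntap add dev {{ interface }} mode tap   # hypothetical; not executed in this run
      when:
        - type == 'tap'
        - state == 'present'
        - interface not in current_interfaces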
23826 1726867427.42955: running TaskExecutor() for managed_node2/TASK: Create tap interface ethtest0 23826 1726867427.43050: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000001b7 23826 1726867427.43133: variable 'ansible_search_path' from source: unknown 23826 1726867427.43140: variable 'ansible_search_path' from source: unknown 23826 1726867427.43180: calling self._execute() 23826 1726867427.43305: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867427.43452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867427.43468: variable 'omit' from source: magic vars 23826 1726867427.44090: variable 'ansible_distribution_major_version' from source: facts 23826 1726867427.44121: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867427.44481: variable 'type' from source: set_fact 23826 1726867427.44593: variable 'state' from source: include params 23826 1726867427.44608: variable 'interface' from source: set_fact 23826 1726867427.44617: variable 'current_interfaces' from source: set_fact 23826 1726867427.44629: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 23826 1726867427.44636: when evaluation is False, skipping this task 23826 1726867427.44642: _execute() done 23826 1726867427.44647: dumping result to json 23826 1726867427.44655: done dumping result, returning 23826 1726867427.44664: done running TaskExecutor() for managed_node2/TASK: Create tap interface ethtest0 [0affcac9-a3a5-a92d-a3ea-0000000001b7] 23826 1726867427.44673: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b7 23826 1726867427.44813: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b7 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 23826 1726867427.44862: no more pending results, returning what we have 23826 1726867427.44866: results queue empty 23826 1726867427.44867: checking for any_errors_fatal 23826 1726867427.44873: done checking for any_errors_fatal 23826 1726867427.44874: checking for max_fail_percentage 23826 1726867427.44875: done checking for max_fail_percentage 23826 1726867427.44876: checking to see if all hosts have failed and the running result is not ok 23826 1726867427.44879: done checking to see if all hosts have failed 23826 1726867427.44880: getting the remaining hosts for this loop 23826 1726867427.44881: done getting the remaining hosts for this loop 23826 1726867427.44885: getting the next task for host managed_node2 23826 1726867427.44892: done getting next task for host managed_node2 23826 1726867427.44895: ^ task is: TASK: Delete tap interface {{ interface }} 23826 1726867427.44898: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867427.44902: getting variables 23826 1726867427.44903: in VariableManager get_vars() 23826 1726867427.44939: Calling all_inventory to load vars for managed_node2 23826 1726867427.44942: Calling groups_inventory to load vars for managed_node2 23826 1726867427.44944: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867427.44957: Calling all_plugins_play to load vars for managed_node2 23826 1726867427.44960: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867427.44963: Calling groups_plugins_play to load vars for managed_node2 23826 1726867427.45684: WORKER PROCESS EXITING 23826 1726867427.45706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867427.46005: done with get_vars() 23826 1726867427.46014: done getting variables 23826 1726867427.46067: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 23826 1726867427.46166: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 17:23:47 -0400 (0:00:00.045) 0:00:09.476 ****** 23826 1726867427.46498: entering _queue_task() for managed_node2/command 23826 1726867427.46821: worker is 1 (out of 1 available) 23826 1726867427.46832: exiting _queue_task() for managed_node2/command 23826 1726867427.46844: done queuing things up, now waiting for results queue to drain 23826 1726867427.46845: waiting for pending results... 
23826 1726867427.47154: running TaskExecutor() for managed_node2/TASK: Delete tap interface ethtest0 23826 1726867427.47253: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000001b8 23826 1726867427.47333: variable 'ansible_search_path' from source: unknown 23826 1726867427.47342: variable 'ansible_search_path' from source: unknown 23826 1726867427.47383: calling self._execute() 23826 1726867427.47505: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867427.47509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867427.47512: variable 'omit' from source: magic vars 23826 1726867427.47935: variable 'ansible_distribution_major_version' from source: facts 23826 1726867427.47960: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867427.48269: variable 'type' from source: set_fact 23826 1726867427.48272: variable 'state' from source: include params 23826 1726867427.48274: variable 'interface' from source: set_fact 23826 1726867427.48278: variable 'current_interfaces' from source: set_fact 23826 1726867427.48282: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 23826 1726867427.48284: when evaluation is False, skipping this task 23826 1726867427.48287: _execute() done 23826 1726867427.48289: dumping result to json 23826 1726867427.48291: done dumping result, returning 23826 1726867427.48294: done running TaskExecutor() for managed_node2/TASK: Delete tap interface ethtest0 [0affcac9-a3a5-a92d-a3ea-0000000001b8] 23826 1726867427.48296: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b8 23826 1726867427.48355: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000001b8 23826 1726867427.48359: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 23826 1726867427.48407: no more pending results, returning what we have 23826 1726867427.48411: results queue empty 23826 1726867427.48412: checking for any_errors_fatal 23826 1726867427.48420: done checking for any_errors_fatal 23826 1726867427.48420: checking for max_fail_percentage 23826 1726867427.48422: done checking for max_fail_percentage 23826 1726867427.48423: checking to see if all hosts have failed and the running result is not ok 23826 1726867427.48424: done checking to see if all hosts have failed 23826 1726867427.48425: getting the remaining hosts for this loop 23826 1726867427.48426: done getting the remaining hosts for this loop 23826 1726867427.48430: getting the next task for host managed_node2 23826 1726867427.48438: done getting next task for host managed_node2 23826 1726867427.48442: ^ task is: TASK: Include the task 'assert_device_present.yml' 23826 1726867427.48444: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867427.48450: getting variables 23826 1726867427.48451: in VariableManager get_vars() 23826 1726867427.48492: Calling all_inventory to load vars for managed_node2 23826 1726867427.48495: Calling groups_inventory to load vars for managed_node2 23826 1726867427.48498: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867427.48511: Calling all_plugins_play to load vars for managed_node2 23826 1726867427.48515: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867427.48518: Calling groups_plugins_play to load vars for managed_node2 23826 1726867427.48875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867427.49066: done with get_vars() 23826 1726867427.49107: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:20 Friday 20 September 2024 17:23:47 -0400 (0:00:00.027) 0:00:09.503 ****** 23826 1726867427.49203: entering _queue_task() for managed_node2/include_tasks 23826 1726867427.49960: worker is 1 (out of 1 available) 23826 1726867427.49971: exiting _queue_task() for managed_node2/include_tasks 23826 1726867427.49983: done queuing things up, now waiting for results queue to drain 23826 1726867427.49984: waiting for pending results... 23826 1726867427.50296: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' 23826 1726867427.50384: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000000e 23826 1726867427.50387: variable 'ansible_search_path' from source: unknown 23826 1726867427.50390: calling self._execute() 23826 1726867427.50469: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867427.50483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867427.50496: variable 'omit' from source: magic vars 23826 1726867427.50861: variable 'ansible_distribution_major_version' from source: facts 23826 1726867427.50876: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867427.50947: _execute() done 23826 1726867427.50952: dumping result to json 23826 1726867427.50955: done dumping result, returning 23826 1726867427.50957: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' [0affcac9-a3a5-a92d-a3ea-00000000000e] 23826 1726867427.50960: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000000e 23826 1726867427.51024: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000000e 23826 1726867427.51027: WORKER PROCESS EXITING 23826 1726867427.51053: no more pending results, returning what we have 23826 1726867427.51057: in VariableManager get_vars() 23826 1726867427.51099: Calling all_inventory to load vars for managed_node2 23826 1726867427.51102: Calling groups_inventory to load vars for managed_node2 23826 1726867427.51104: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867427.51119: Calling all_plugins_play to load vars for managed_node2 23826 1726867427.51121: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867427.51124: Calling groups_plugins_play to load vars for managed_node2 23826 1726867427.51725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867427.51931: done with get_vars() 23826 
1726867427.51938: variable 'ansible_search_path' from source: unknown 23826 1726867427.51949: we have included files to process 23826 1726867427.51950: generating all_blocks data 23826 1726867427.51952: done generating all_blocks data 23826 1726867427.51954: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 23826 1726867427.51955: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 23826 1726867427.51957: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 23826 1726867427.52104: in VariableManager get_vars() 23826 1726867427.52123: done with get_vars() 23826 1726867427.52233: done processing included file 23826 1726867427.52235: iterating over new_blocks loaded from include file 23826 1726867427.52236: in VariableManager get_vars() 23826 1726867427.52256: done with get_vars() 23826 1726867427.52258: filtering new block on tags 23826 1726867427.52274: done filtering new block on tags 23826 1726867427.52276: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 23826 1726867427.52282: extending task lists for all hosts with included blocks 23826 1726867427.54825: done extending task lists 23826 1726867427.54826: done processing included files 23826 1726867427.54827: results queue empty 23826 1726867427.54828: checking for any_errors_fatal 23826 1726867427.54830: done checking for any_errors_fatal 23826 1726867427.54831: checking for max_fail_percentage 23826 1726867427.54832: done checking for max_fail_percentage 23826 1726867427.54832: checking to see if all hosts have failed and the running result is not ok 23826 1726867427.54833: done checking to see if all hosts have failed 23826 1726867427.54834: getting the remaining hosts for this loop 23826 1726867427.54835: done getting the remaining hosts for this loop 23826 1726867427.54837: getting the next task for host managed_node2 23826 1726867427.54841: done getting next task for host managed_node2 23826 1726867427.54843: ^ task is: TASK: Include the task 'get_interface_stat.yml' 23826 1726867427.54845: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867427.54847: getting variables 23826 1726867427.54848: in VariableManager get_vars() 23826 1726867427.54868: Calling all_inventory to load vars for managed_node2 23826 1726867427.54870: Calling groups_inventory to load vars for managed_node2 23826 1726867427.54872: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867427.54879: Calling all_plugins_play to load vars for managed_node2 23826 1726867427.54881: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867427.54884: Calling groups_plugins_play to load vars for managed_node2 23826 1726867427.55031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867427.55254: done with get_vars() 23826 1726867427.55262: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 17:23:47 -0400 (0:00:00.061) 0:00:09.565 ****** 23826 1726867427.55366: entering _queue_task() for managed_node2/include_tasks 23826 1726867427.55692: worker is 1 (out of 1 available) 23826 1726867427.55703: exiting _queue_task() for managed_node2/include_tasks 23826 1726867427.55715: done queuing things up, now waiting for results queue to drain 23826 1726867427.55717: waiting for pending results... 23826 1726867427.55983: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 23826 1726867427.56187: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000002bc 23826 1726867427.56191: variable 'ansible_search_path' from source: unknown 23826 1726867427.56193: variable 'ansible_search_path' from source: unknown 23826 1726867427.56195: calling self._execute() 23826 1726867427.56226: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867427.56235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867427.56246: variable 'omit' from source: magic vars 23826 1726867427.56664: variable 'ansible_distribution_major_version' from source: facts 23826 1726867427.56683: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867427.56694: _execute() done 23826 1726867427.56702: dumping result to json 23826 1726867427.56713: done dumping result, returning 23826 1726867427.56732: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-a92d-a3ea-0000000002bc] 23826 1726867427.56746: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000002bc 23826 1726867427.56910: no more pending results, returning what we have 23826 1726867427.56916: in VariableManager get_vars() 23826 1726867427.56967: Calling all_inventory to load vars for managed_node2 23826 1726867427.56970: Calling groups_inventory to load vars for managed_node2 23826 1726867427.56973: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867427.56988: Calling all_plugins_play to load vars for managed_node2 23826 1726867427.56991: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867427.56994: Calling groups_plugins_play to load vars for managed_node2 23826 1726867427.57391: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000002bc 23826 1726867427.57426: WORKER PROCESS EXITING 23826 1726867427.57451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 23826 1726867427.57691: done with get_vars() 23826 1726867427.57699: variable 'ansible_search_path' from source: unknown 23826 1726867427.57700: variable 'ansible_search_path' from source: unknown 23826 1726867427.57803: we have included files to process 23826 1726867427.57804: generating all_blocks data 23826 1726867427.57806: done generating all_blocks data 23826 1726867427.57851: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 23826 1726867427.57852: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 23826 1726867427.57855: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 23826 1726867427.58465: done processing included file 23826 1726867427.58467: iterating over new_blocks loaded from include file 23826 1726867427.58468: in VariableManager get_vars() 23826 1726867427.58600: done with get_vars() 23826 1726867427.58604: filtering new block on tags 23826 1726867427.58623: done filtering new block on tags 23826 1726867427.58626: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 23826 1726867427.58630: extending task lists for all hosts with included blocks 23826 1726867427.58834: done extending task lists 23826 1726867427.58836: done processing included files 23826 1726867427.58836: results queue empty 23826 1726867427.58837: checking for any_errors_fatal 23826 1726867427.58840: done checking for any_errors_fatal 23826 1726867427.58841: checking for max_fail_percentage 23826 1726867427.58842: done checking for max_fail_percentage 23826 1726867427.58842: checking to see if all hosts have failed and the running result is not ok 23826 1726867427.58843: done checking to see if all hosts have failed 23826 1726867427.58844: getting the remaining hosts for this loop 23826 1726867427.58845: done getting the remaining hosts for this loop 23826 1726867427.58848: getting the next task for host managed_node2 23826 1726867427.58852: done getting next task for host managed_node2 23826 1726867427.58854: ^ task is: TASK: Get stat for interface {{ interface }} 23826 1726867427.58857: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867427.58860: getting variables 23826 1726867427.58861: in VariableManager get_vars() 23826 1726867427.58871: Calling all_inventory to load vars for managed_node2 23826 1726867427.58873: Calling groups_inventory to load vars for managed_node2 23826 1726867427.58875: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867427.59025: Calling all_plugins_play to load vars for managed_node2 23826 1726867427.59028: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867427.59031: Calling groups_plugins_play to load vars for managed_node2 23826 1726867427.59352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867427.59662: done with get_vars() 23826 1726867427.59788: done getting variables 23826 1726867427.60088: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:23:47 -0400 (0:00:00.047) 0:00:09.612 ****** 23826 1726867427.60128: entering _queue_task() for managed_node2/stat 23826 1726867427.60503: worker is 1 (out of 1 available) 23826 1726867427.60517: exiting _queue_task() for managed_node2/stat 23826 1726867427.60527: done queuing things up, now waiting for results queue to drain 23826 1726867427.60528: waiting for pending results... 23826 1726867427.61031: running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 23826 1726867427.61398: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000373 23826 1726867427.61452: variable 'ansible_search_path' from source: unknown 23826 1726867427.61456: variable 'ansible_search_path' from source: unknown 23826 1726867427.61459: calling self._execute() 23826 1726867427.61868: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867427.61959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867427.61963: variable 'omit' from source: magic vars 23826 1726867427.63214: variable 'ansible_distribution_major_version' from source: facts 23826 1726867427.63225: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867427.63231: variable 'omit' from source: magic vars 23826 1726867427.63311: variable 'omit' from source: magic vars 23826 1726867427.63474: variable 'interface' from source: set_fact 23826 1726867427.63497: variable 'omit' from source: magic vars 23826 1726867427.63536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867427.63610: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867427.63682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867427.63687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867427.63689: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867427.63692: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867427.63701: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867427.63704: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 23826 1726867427.63811: Set connection var ansible_timeout to 10 23826 1726867427.63814: Set connection var ansible_shell_executable to /bin/sh 23826 1726867427.63817: Set connection var ansible_connection to ssh 23826 1726867427.63825: Set connection var ansible_pipelining to False 23826 1726867427.63828: Set connection var ansible_shell_type to sh 23826 1726867427.63834: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867427.63856: variable 'ansible_shell_executable' from source: unknown 23826 1726867427.63860: variable 'ansible_connection' from source: unknown 23826 1726867427.63862: variable 'ansible_module_compression' from source: unknown 23826 1726867427.63865: variable 'ansible_shell_type' from source: unknown 23826 1726867427.63867: variable 'ansible_shell_executable' from source: unknown 23826 1726867427.63869: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867427.63871: variable 'ansible_pipelining' from source: unknown 23826 1726867427.63874: variable 'ansible_timeout' from source: unknown 23826 1726867427.63917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867427.64086: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 23826 1726867427.64095: variable 'omit' from source: magic vars 23826 1726867427.64102: starting attempt loop 23826 1726867427.64104: running the handler 23826 1726867427.64126: _low_level_execute_command(): starting 23826 1726867427.64156: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867427.64893: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867427.64924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867427.64935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867427.64939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867427.65021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867427.67112: stdout chunk (state=3): >>>/root <<< 23826 1726867427.67120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867427.67123: stdout chunk (state=3): >>><<< 23826 1726867427.67125: stderr chunk (state=3): >>><<< 23826 1726867427.67127: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867427.67130: _low_level_execute_command(): starting 23826 1726867427.67133: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867427.6702342-24383-116062297272570 `" && echo ansible-tmp-1726867427.6702342-24383-116062297272570="` echo /root/.ansible/tmp/ansible-tmp-1726867427.6702342-24383-116062297272570 `" ) && sleep 0' 23826 1726867427.68006: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867427.68128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867427.68305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867427.68319: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867427.68340: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867427.68500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867427.70389: stdout chunk (state=3): >>>ansible-tmp-1726867427.6702342-24383-116062297272570=/root/.ansible/tmp/ansible-tmp-1726867427.6702342-24383-116062297272570 <<< 23826 1726867427.70502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867427.70523: stderr chunk (state=3): >>><<< 23826 1726867427.70526: stdout chunk (state=3): >>><<< 23826 1726867427.70541: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726867427.6702342-24383-116062297272570=/root/.ansible/tmp/ansible-tmp-1726867427.6702342-24383-116062297272570 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867427.70585: variable 'ansible_module_compression' from source: unknown 23826 1726867427.70635: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 23826 1726867427.70698: variable 'ansible_facts' from source: unknown 23826 1726867427.70756: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867427.6702342-24383-116062297272570/AnsiballZ_stat.py 23826 1726867427.70947: Sending initial data 23826 1726867427.70950: Sent initial data (153 bytes) 23826 1726867427.71672: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867427.71687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867427.71818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867427.73402: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867427.73451: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 23826 1726867427.73493: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmp275lxxzu /root/.ansible/tmp/ansible-tmp-1726867427.6702342-24383-116062297272570/AnsiballZ_stat.py <<< 23826 1726867427.73500: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867427.6702342-24383-116062297272570/AnsiballZ_stat.py" <<< 23826 1726867427.73536: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmp275lxxzu" to remote "/root/.ansible/tmp/ansible-tmp-1726867427.6702342-24383-116062297272570/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867427.6702342-24383-116062297272570/AnsiballZ_stat.py" <<< 23826 1726867427.74349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867427.74409: stderr chunk (state=3): >>><<< 23826 1726867427.74412: stdout chunk (state=3): >>><<< 23826 1726867427.74414: done transferring module to remote 23826 1726867427.74416: _low_level_execute_command(): starting 23826 1726867427.74430: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867427.6702342-24383-116062297272570/ /root/.ansible/tmp/ansible-tmp-1726867427.6702342-24383-116062297272570/AnsiballZ_stat.py && sleep 0' 23826 1726867427.75361: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867427.75364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867427.75482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867427.75497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867427.75508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867427.75584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867427.77471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867427.77475: stdout chunk (state=3): >>><<< 23826 
1726867427.77494: stderr chunk (state=3): >>><<< 23826 1726867427.77510: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867427.77516: _low_level_execute_command(): starting 23826 1726867427.77583: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867427.6702342-24383-116062297272570/AnsiballZ_stat.py && sleep 0' 23826 1726867427.78242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867427.78251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867427.78458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867427.78462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867427.78465: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867427.78469: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867427.78472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867427.78478: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867427.78482: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867427.78484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867427.78510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867427.94031: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": 
false, "uid": 0, "gid": 0, "size": 0, "inode": 28960, "dev": 23, "nlink": 1, "atime": 1726867425.9533818, "mtime": 1726867425.9533818, "ctime": 1726867425.9533818, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 23826 1726867427.95371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 23826 1726867427.95393: stderr chunk (state=3): >>><<< 23826 1726867427.95397: stdout chunk (state=3): >>><<< 23826 1726867427.95416: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28960, "dev": 23, "nlink": 1, "atime": 1726867425.9533818, "mtime": 1726867425.9533818, "ctime": 1726867425.9533818, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
23826 1726867427.95451: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867427.6702342-24383-116062297272570/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867427.95460: _low_level_execute_command(): starting 23826 1726867427.95464: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867427.6702342-24383-116062297272570/ > /dev/null 2>&1 && sleep 0' 23826 1726867427.95962: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867427.95965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867427.95969: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867427.95971: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867427.95986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867427.96003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867427.96044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867427.97851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867427.97884: stderr chunk (state=3): >>><<< 23826 1726867427.97887: stdout chunk (state=3): >>><<< 23826 1726867427.97895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867427.97901: handler run complete 23826 1726867427.97934: attempt loop complete, returning result 23826 1726867427.97937: _execute() done 23826 1726867427.97939: dumping result to json 23826 1726867427.97944: done dumping result, returning 23826 1726867427.97952: done running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 [0affcac9-a3a5-a92d-a3ea-000000000373] 23826 1726867427.97960: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000373 23826 1726867427.98087: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000373 23826 1726867427.98092: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726867425.9533818, "block_size": 4096, "blocks": 0, "ctime": 1726867425.9533818, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28960, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1726867425.9533818, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 23826 1726867427.98373: no more pending results, returning what we have 23826 1726867427.98376: results queue empty 23826 1726867427.98379: checking for any_errors_fatal 23826 1726867427.98381: done checking for any_errors_fatal 23826 1726867427.98382: checking for max_fail_percentage 23826 1726867427.98383: done checking for max_fail_percentage 23826 1726867427.98384: checking to see if all hosts have failed and the running result is not ok 23826 1726867427.98385: done checking to see if all hosts have failed 23826 1726867427.98385: getting the remaining hosts for this loop 23826 1726867427.98386: done getting the remaining hosts for this loop 23826 1726867427.98390: getting the next task for host managed_node2 23826 1726867427.98397: done getting next task for host managed_node2 23826 1726867427.98399: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 23826 1726867427.98402: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867427.98406: getting variables 23826 1726867427.98409: in VariableManager get_vars() 23826 1726867427.98655: Calling all_inventory to load vars for managed_node2 23826 1726867427.98659: Calling groups_inventory to load vars for managed_node2 23826 1726867427.98662: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867427.98671: Calling all_plugins_play to load vars for managed_node2 23826 1726867427.98674: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867427.98690: Calling groups_plugins_play to load vars for managed_node2 23826 1726867427.98941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867427.99065: done with get_vars() 23826 1726867427.99073: done getting variables 23826 1726867427.99147: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 23826 1726867427.99232: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 17:23:47 -0400 (0:00:00.391) 0:00:10.003 ****** 23826 1726867427.99258: entering _queue_task() for managed_node2/assert 23826 1726867427.99259: Creating lock for assert 23826 1726867427.99565: worker is 1 (out of 1 available) 23826 1726867427.99582: exiting _queue_task() for managed_node2/assert 23826 1726867427.99600: done queuing things up, now waiting for results queue to drain 23826 1726867427.99602: waiting for pending results... 
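The assert action queued here comes from assert_device_present.yml:5. As the execution below shows, the only condition it evaluates is interface_stat.stat.exists, which is True because the stat above found the /sys/class/net/ethtest0 symlink. A minimal sketch of an assert matching that behaviour (any failure message in the real file is not observable from this passing run):

    - name: Assert that the interface is present - '{{ interface }}'
      assert:
        that:
          - interface_stat.stat.exists
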
23826 1726867427.99747: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'ethtest0' 23826 1726867427.99826: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000002bd 23826 1726867427.99837: variable 'ansible_search_path' from source: unknown 23826 1726867427.99840: variable 'ansible_search_path' from source: unknown 23826 1726867427.99869: calling self._execute() 23826 1726867427.99947: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867427.99952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867427.99960: variable 'omit' from source: magic vars 23826 1726867428.00243: variable 'ansible_distribution_major_version' from source: facts 23826 1726867428.00247: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867428.00257: variable 'omit' from source: magic vars 23826 1726867428.00281: variable 'omit' from source: magic vars 23826 1726867428.00349: variable 'interface' from source: set_fact 23826 1726867428.00364: variable 'omit' from source: magic vars 23826 1726867428.00398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867428.00428: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867428.00443: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867428.00456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867428.00473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867428.00495: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867428.00499: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867428.00501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867428.00608: Set connection var ansible_timeout to 10 23826 1726867428.00617: Set connection var ansible_shell_executable to /bin/sh 23826 1726867428.00620: Set connection var ansible_connection to ssh 23826 1726867428.00626: Set connection var ansible_pipelining to False 23826 1726867428.00628: Set connection var ansible_shell_type to sh 23826 1726867428.00651: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867428.00672: variable 'ansible_shell_executable' from source: unknown 23826 1726867428.00676: variable 'ansible_connection' from source: unknown 23826 1726867428.00693: variable 'ansible_module_compression' from source: unknown 23826 1726867428.00697: variable 'ansible_shell_type' from source: unknown 23826 1726867428.00700: variable 'ansible_shell_executable' from source: unknown 23826 1726867428.00702: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867428.00708: variable 'ansible_pipelining' from source: unknown 23826 1726867428.00739: variable 'ansible_timeout' from source: unknown 23826 1726867428.00742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867428.00838: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 23826 1726867428.00864: variable 'omit' from source: magic vars 23826 1726867428.00867: starting attempt loop 23826 1726867428.00869: running the handler 23826 1726867428.00985: variable 'interface_stat' from source: set_fact 23826 1726867428.01010: Evaluated conditional (interface_stat.stat.exists): True 23826 1726867428.01020: handler run complete 23826 1726867428.01029: attempt loop complete, returning result 23826 1726867428.01032: _execute() done 23826 1726867428.01049: dumping result to json 23826 1726867428.01052: done dumping result, returning 23826 1726867428.01055: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'ethtest0' [0affcac9-a3a5-a92d-a3ea-0000000002bd] 23826 1726867428.01069: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000002bd 23826 1726867428.01157: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000002bd 23826 1726867428.01160: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 23826 1726867428.01218: no more pending results, returning what we have 23826 1726867428.01222: results queue empty 23826 1726867428.01223: checking for any_errors_fatal 23826 1726867428.01229: done checking for any_errors_fatal 23826 1726867428.01230: checking for max_fail_percentage 23826 1726867428.01232: done checking for max_fail_percentage 23826 1726867428.01233: checking to see if all hosts have failed and the running result is not ok 23826 1726867428.01234: done checking to see if all hosts have failed 23826 1726867428.01235: getting the remaining hosts for this loop 23826 1726867428.01237: done getting the remaining hosts for this loop 23826 1726867428.01240: getting the next task for host managed_node2 23826 1726867428.01247: done getting next task for host managed_node2 23826 1726867428.01249: ^ task is: TASK: Initialize the connection_failed flag 23826 1726867428.01250: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867428.01254: getting variables 23826 1726867428.01256: in VariableManager get_vars() 23826 1726867428.01294: Calling all_inventory to load vars for managed_node2 23826 1726867428.01297: Calling groups_inventory to load vars for managed_node2 23826 1726867428.01300: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867428.01313: Calling all_plugins_play to load vars for managed_node2 23826 1726867428.01317: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867428.01321: Calling groups_plugins_play to load vars for managed_node2 23826 1726867428.01475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867428.01692: done with get_vars() 23826 1726867428.01699: done getting variables 23826 1726867428.01744: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize the connection_failed flag] *********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:23 Friday 20 September 2024 17:23:48 -0400 (0:00:00.025) 0:00:10.028 ****** 23826 1726867428.01762: entering _queue_task() for managed_node2/set_fact 23826 1726867428.01947: worker is 1 (out of 1 available) 23826 1726867428.01959: exiting _queue_task() for managed_node2/set_fact 23826 1726867428.01970: done queuing things up, now waiting for results queue to drain 23826 1726867428.01971: waiting for pending results... 23826 1726867428.02126: running TaskExecutor() for managed_node2/TASK: Initialize the connection_failed flag 23826 1726867428.02171: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000000f 23826 1726867428.02186: variable 'ansible_search_path' from source: unknown 23826 1726867428.02218: calling self._execute() 23826 1726867428.02279: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867428.02286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867428.02294: variable 'omit' from source: magic vars 23826 1726867428.02600: variable 'ansible_distribution_major_version' from source: facts 23826 1726867428.02611: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867428.02614: variable 'omit' from source: magic vars 23826 1726867428.02630: variable 'omit' from source: magic vars 23826 1726867428.02685: variable 'omit' from source: magic vars 23826 1726867428.02700: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867428.02729: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867428.02756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867428.02764: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867428.02778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867428.02819: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867428.02822: variable 'ansible_host' from 
source: host vars for 'managed_node2' 23826 1726867428.02825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867428.02914: Set connection var ansible_timeout to 10 23826 1726867428.02922: Set connection var ansible_shell_executable to /bin/sh 23826 1726867428.02925: Set connection var ansible_connection to ssh 23826 1726867428.02929: Set connection var ansible_pipelining to False 23826 1726867428.02931: Set connection var ansible_shell_type to sh 23826 1726867428.02933: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867428.02983: variable 'ansible_shell_executable' from source: unknown 23826 1726867428.02987: variable 'ansible_connection' from source: unknown 23826 1726867428.02989: variable 'ansible_module_compression' from source: unknown 23826 1726867428.02992: variable 'ansible_shell_type' from source: unknown 23826 1726867428.02994: variable 'ansible_shell_executable' from source: unknown 23826 1726867428.02996: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867428.02998: variable 'ansible_pipelining' from source: unknown 23826 1726867428.03010: variable 'ansible_timeout' from source: unknown 23826 1726867428.03014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867428.03135: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867428.03143: variable 'omit' from source: magic vars 23826 1726867428.03154: starting attempt loop 23826 1726867428.03160: running the handler 23826 1726867428.03163: handler run complete 23826 1726867428.03194: attempt loop complete, returning result 23826 1726867428.03199: _execute() done 23826 1726867428.03202: dumping result to json 23826 1726867428.03203: done dumping result, returning 23826 1726867428.03206: done running TaskExecutor() for managed_node2/TASK: Initialize the connection_failed flag [0affcac9-a3a5-a92d-a3ea-00000000000f] 23826 1726867428.03216: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000000f 23826 1726867428.03276: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000000f 23826 1726867428.03282: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "connection_failed": false }, "changed": false } 23826 1726867428.03349: no more pending results, returning what we have 23826 1726867428.03352: results queue empty 23826 1726867428.03353: checking for any_errors_fatal 23826 1726867428.03358: done checking for any_errors_fatal 23826 1726867428.03359: checking for max_fail_percentage 23826 1726867428.03360: done checking for max_fail_percentage 23826 1726867428.03360: checking to see if all hosts have failed and the running result is not ok 23826 1726867428.03361: done checking to see if all hosts have failed 23826 1726867428.03362: getting the remaining hosts for this loop 23826 1726867428.03365: done getting the remaining hosts for this loop 23826 1726867428.03370: getting the next task for host managed_node2 23826 1726867428.03413: done getting next task for host managed_node2 23826 1726867428.03417: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 23826 1726867428.03420: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867428.03430: getting variables 23826 1726867428.03431: in VariableManager get_vars() 23826 1726867428.03452: Calling all_inventory to load vars for managed_node2 23826 1726867428.03454: Calling groups_inventory to load vars for managed_node2 23826 1726867428.03455: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867428.03460: Calling all_plugins_play to load vars for managed_node2 23826 1726867428.03462: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867428.03463: Calling groups_plugins_play to load vars for managed_node2 23826 1726867428.03637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867428.03797: done with get_vars() 23826 1726867428.03808: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:23:48 -0400 (0:00:00.021) 0:00:10.050 ****** 23826 1726867428.03893: entering _queue_task() for managed_node2/include_tasks 23826 1726867428.04072: worker is 1 (out of 1 available) 23826 1726867428.04085: exiting _queue_task() for managed_node2/include_tasks 23826 1726867428.04098: done queuing things up, now waiting for results queue to drain 23826 1726867428.04099: waiting for pending results... 
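[Editor's note] Two small tasks drive the stretch of log around this point. The first is the flag initialization from the test playbook (tests_ipv6_disabled.yml:23), whose result is printed above as ansible_facts.connection_failed: false. The second is the role entry point at roles/network/tasks/main.yml:4, which the log shows pulling in set_facts.yml as an included file. A minimal sketch assuming the obvious set_fact/include_tasks shapes (task names match the log; everything else is inferred):

    # In the test playbook tests_ipv6_disabled.yml:
    - name: Initialize the connection_failed flag
      set_fact:
        connection_failed: false

    # In roles/network/tasks/main.yml:
    - name: Ensure ansible_facts used by role
      include_tasks: set_facts.yml

The repeated "Evaluated conditional (ansible_distribution_major_version != '6'): True" messages show a distribution-version guard being applied to each task; that guard is omitted from the sketches above.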
23826 1726867428.04315: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 23826 1726867428.04406: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000017 23826 1726867428.04419: variable 'ansible_search_path' from source: unknown 23826 1726867428.04422: variable 'ansible_search_path' from source: unknown 23826 1726867428.04491: calling self._execute() 23826 1726867428.04554: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867428.04558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867428.04567: variable 'omit' from source: magic vars 23826 1726867428.04834: variable 'ansible_distribution_major_version' from source: facts 23826 1726867428.04843: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867428.04848: _execute() done 23826 1726867428.04851: dumping result to json 23826 1726867428.04857: done dumping result, returning 23826 1726867428.04863: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-a92d-a3ea-000000000017] 23826 1726867428.04872: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000017 23826 1726867428.04952: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000017 23826 1726867428.04955: WORKER PROCESS EXITING 23826 1726867428.05012: no more pending results, returning what we have 23826 1726867428.05016: in VariableManager get_vars() 23826 1726867428.05048: Calling all_inventory to load vars for managed_node2 23826 1726867428.05050: Calling groups_inventory to load vars for managed_node2 23826 1726867428.05052: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867428.05060: Calling all_plugins_play to load vars for managed_node2 23826 1726867428.05062: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867428.05064: Calling groups_plugins_play to load vars for managed_node2 23826 1726867428.05240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867428.05399: done with get_vars() 23826 1726867428.05408: variable 'ansible_search_path' from source: unknown 23826 1726867428.05410: variable 'ansible_search_path' from source: unknown 23826 1726867428.05440: we have included files to process 23826 1726867428.05441: generating all_blocks data 23826 1726867428.05442: done generating all_blocks data 23826 1726867428.05444: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 23826 1726867428.05445: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 23826 1726867428.05446: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 23826 1726867428.05988: done processing included file 23826 1726867428.05990: iterating over new_blocks loaded from include file 23826 1726867428.05991: in VariableManager get_vars() 23826 1726867428.06004: done with get_vars() 23826 1726867428.06005: filtering new block on tags 23826 1726867428.06017: done filtering new block on tags 23826 1726867428.06019: in VariableManager get_vars() 23826 1726867428.06030: done with get_vars() 23826 1726867428.06031: filtering new block on tags 23826 1726867428.06043: done filtering new block on tags 23826 1726867428.06045: in 
VariableManager get_vars() 23826 1726867428.06073: done with get_vars() 23826 1726867428.06075: filtering new block on tags 23826 1726867428.06098: done filtering new block on tags 23826 1726867428.06100: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 23826 1726867428.06105: extending task lists for all hosts with included blocks 23826 1726867428.06580: done extending task lists 23826 1726867428.06581: done processing included files 23826 1726867428.06582: results queue empty 23826 1726867428.06582: checking for any_errors_fatal 23826 1726867428.06584: done checking for any_errors_fatal 23826 1726867428.06584: checking for max_fail_percentage 23826 1726867428.06585: done checking for max_fail_percentage 23826 1726867428.06586: checking to see if all hosts have failed and the running result is not ok 23826 1726867428.06586: done checking to see if all hosts have failed 23826 1726867428.06587: getting the remaining hosts for this loop 23826 1726867428.06587: done getting the remaining hosts for this loop 23826 1726867428.06589: getting the next task for host managed_node2 23826 1726867428.06591: done getting next task for host managed_node2 23826 1726867428.06593: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 23826 1726867428.06595: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867428.06601: getting variables 23826 1726867428.06601: in VariableManager get_vars() 23826 1726867428.06611: Calling all_inventory to load vars for managed_node2 23826 1726867428.06612: Calling groups_inventory to load vars for managed_node2 23826 1726867428.06614: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867428.06616: Calling all_plugins_play to load vars for managed_node2 23826 1726867428.06618: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867428.06620: Calling groups_plugins_play to load vars for managed_node2 23826 1726867428.06718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867428.06830: done with get_vars() 23826 1726867428.06836: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:23:48 -0400 (0:00:00.029) 0:00:10.080 ****** 23826 1726867428.06887: entering _queue_task() for managed_node2/setup 23826 1726867428.07072: worker is 1 (out of 1 available) 23826 1726867428.07085: exiting _queue_task() for managed_node2/setup 23826 1726867428.07101: done queuing things up, now waiting for results queue to drain 23826 1726867428.07103: waiting for pending results... 23826 1726867428.07321: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 23826 1726867428.07394: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000038e 23826 1726867428.07404: variable 'ansible_search_path' from source: unknown 23826 1726867428.07413: variable 'ansible_search_path' from source: unknown 23826 1726867428.07437: calling self._execute() 23826 1726867428.07496: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867428.07499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867428.07510: variable 'omit' from source: magic vars 23826 1726867428.07749: variable 'ansible_distribution_major_version' from source: facts 23826 1726867428.07759: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867428.07903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867428.09402: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867428.09448: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867428.09475: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867428.09503: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867428.09523: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867428.09582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867428.09603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 23826 1726867428.09621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867428.09648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867428.09661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867428.09700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867428.09719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867428.09737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867428.09781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867428.09819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867428.09970: variable '__network_required_facts' from source: role '' defaults 23826 1726867428.09973: variable 'ansible_facts' from source: unknown 23826 1726867428.10042: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 23826 1726867428.10046: when evaluation is False, skipping this task 23826 1726867428.10048: _execute() done 23826 1726867428.10051: dumping result to json 23826 1726867428.10055: done dumping result, returning 23826 1726867428.10081: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-a92d-a3ea-00000000038e] 23826 1726867428.10085: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000038e 23826 1726867428.10163: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000038e 23826 1726867428.10170: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 23826 1726867428.10230: no more pending results, returning what we have 23826 1726867428.10233: results queue empty 23826 1726867428.10234: checking for any_errors_fatal 23826 1726867428.10239: done checking for any_errors_fatal 23826 1726867428.10239: checking for max_fail_percentage 23826 1726867428.10241: done checking for max_fail_percentage 23826 1726867428.10242: checking to see if all hosts have failed and the running result is not ok 23826 1726867428.10242: done checking to see if all hosts have failed 23826 1726867428.10243: getting the remaining hosts for this loop 23826 1726867428.10244: done getting the remaining hosts for 
this loop 23826 1726867428.10247: getting the next task for host managed_node2 23826 1726867428.10254: done getting next task for host managed_node2 23826 1726867428.10258: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 23826 1726867428.10261: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867428.10272: getting variables 23826 1726867428.10273: in VariableManager get_vars() 23826 1726867428.10309: Calling all_inventory to load vars for managed_node2 23826 1726867428.10316: Calling groups_inventory to load vars for managed_node2 23826 1726867428.10319: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867428.10327: Calling all_plugins_play to load vars for managed_node2 23826 1726867428.10330: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867428.10335: Calling groups_plugins_play to load vars for managed_node2 23826 1726867428.10451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867428.10659: done with get_vars() 23826 1726867428.10671: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:23:48 -0400 (0:00:00.039) 0:00:10.119 ****** 23826 1726867428.10819: entering _queue_task() for managed_node2/stat 23826 1726867428.11092: worker is 1 (out of 1 available) 23826 1726867428.11105: exiting _queue_task() for managed_node2/stat 23826 1726867428.11118: done queuing things up, now waiting for results queue to drain 23826 1726867428.11120: waiting for pending results... 
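[Editor's note] The two set_facts.yml tasks around this point can be partly reconstructed from the log. The first (set_facts.yml:3) is a setup call guarded by the __network_required_facts difference check quoted above; it was skipped because the required facts were already present, and its output was censored, which implies no_log: true. The second (set_facts.yml:12) is the ostree check just queued; its false_condition printed below shows the guard "not __network_is_ostree is defined". A sketch under those observations; the gather_subset value, the /run/ostree-booted path, and the register name are assumptions:

    # Assumed shape of roles/network/tasks/set_facts.yml (excerpt)
    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min                      # assumption; some minimal subset is gathered
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true                              # matches the censored "skipping" result above

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted                # assumed marker file for ostree-based systems
      register: __ostree_booted_stat            # hypothetical register name
      when: not __network_is_ostree is defined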
23826 1726867428.11261: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 23826 1726867428.11352: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000390 23826 1726867428.11361: variable 'ansible_search_path' from source: unknown 23826 1726867428.11365: variable 'ansible_search_path' from source: unknown 23826 1726867428.11391: calling self._execute() 23826 1726867428.11443: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867428.11446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867428.11472: variable 'omit' from source: magic vars 23826 1726867428.11946: variable 'ansible_distribution_major_version' from source: facts 23826 1726867428.11949: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867428.12205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867428.12645: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867428.12691: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867428.12724: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867428.12871: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867428.12949: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867428.12971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867428.13029: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867428.13033: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867428.13402: variable '__network_is_ostree' from source: set_fact 23826 1726867428.13418: Evaluated conditional (not __network_is_ostree is defined): False 23826 1726867428.13425: when evaluation is False, skipping this task 23826 1726867428.13432: _execute() done 23826 1726867428.13491: dumping result to json 23826 1726867428.13495: done dumping result, returning 23826 1726867428.13497: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-a92d-a3ea-000000000390] 23826 1726867428.13500: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000390 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 23826 1726867428.13643: no more pending results, returning what we have 23826 1726867428.13647: results queue empty 23826 1726867428.13648: checking for any_errors_fatal 23826 1726867428.13656: done checking for any_errors_fatal 23826 1726867428.13657: checking for max_fail_percentage 23826 1726867428.13659: done checking for max_fail_percentage 23826 1726867428.13660: checking to see if all hosts have 
failed and the running result is not ok 23826 1726867428.13661: done checking to see if all hosts have failed 23826 1726867428.13662: getting the remaining hosts for this loop 23826 1726867428.13663: done getting the remaining hosts for this loop 23826 1726867428.13667: getting the next task for host managed_node2 23826 1726867428.13675: done getting next task for host managed_node2 23826 1726867428.13681: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 23826 1726867428.13685: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867428.13700: getting variables 23826 1726867428.13702: in VariableManager get_vars() 23826 1726867428.13748: Calling all_inventory to load vars for managed_node2 23826 1726867428.13751: Calling groups_inventory to load vars for managed_node2 23826 1726867428.13754: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867428.13764: Calling all_plugins_play to load vars for managed_node2 23826 1726867428.13767: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867428.13770: Calling groups_plugins_play to load vars for managed_node2 23826 1726867428.13893: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000390 23826 1726867428.13896: WORKER PROCESS EXITING 23826 1726867428.14111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867428.14236: done with get_vars() 23826 1726867428.14244: done getting variables 23826 1726867428.14284: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:23:48 -0400 (0:00:00.034) 0:00:10.154 ****** 23826 1726867428.14310: entering _queue_task() for managed_node2/set_fact 23826 1726867428.14504: worker is 1 (out of 1 available) 23826 1726867428.14520: exiting _queue_task() for managed_node2/set_fact 23826 1726867428.14531: done queuing things up, now waiting for results queue to drain 23826 1726867428.14533: waiting for pending results... 
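[Editor's note] The companion task at set_facts.yml:17, queued just above, turns the stat result into the __network_is_ostree flag; the skip printed below shows it is bypassed for the same reason as the stat task (the flag is already defined, so "not __network_is_ostree is defined" is False). A sketch reusing the hypothetical register name from the previous note:

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined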
23826 1726867428.14686: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 23826 1726867428.14773: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000391 23826 1726867428.14786: variable 'ansible_search_path' from source: unknown 23826 1726867428.14790: variable 'ansible_search_path' from source: unknown 23826 1726867428.14817: calling self._execute() 23826 1726867428.14878: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867428.14885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867428.14893: variable 'omit' from source: magic vars 23826 1726867428.15144: variable 'ansible_distribution_major_version' from source: facts 23826 1726867428.15153: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867428.15267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867428.15510: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867428.15542: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867428.15567: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867428.15593: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867428.15655: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867428.15672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867428.15692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867428.15712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867428.15775: variable '__network_is_ostree' from source: set_fact 23826 1726867428.15781: Evaluated conditional (not __network_is_ostree is defined): False 23826 1726867428.15787: when evaluation is False, skipping this task 23826 1726867428.15791: _execute() done 23826 1726867428.15795: dumping result to json 23826 1726867428.15798: done dumping result, returning 23826 1726867428.15823: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-a92d-a3ea-000000000391] 23826 1726867428.15826: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000391 23826 1726867428.15896: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000391 23826 1726867428.15899: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 23826 1726867428.16048: no more pending results, returning what we have 23826 1726867428.16051: results queue empty 23826 1726867428.16053: checking for any_errors_fatal 23826 1726867428.16058: done checking for any_errors_fatal 23826 
1726867428.16059: checking for max_fail_percentage 23826 1726867428.16061: done checking for max_fail_percentage 23826 1726867428.16062: checking to see if all hosts have failed and the running result is not ok 23826 1726867428.16065: done checking to see if all hosts have failed 23826 1726867428.16066: getting the remaining hosts for this loop 23826 1726867428.16067: done getting the remaining hosts for this loop 23826 1726867428.16071: getting the next task for host managed_node2 23826 1726867428.16083: done getting next task for host managed_node2 23826 1726867428.16088: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 23826 1726867428.16091: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867428.16104: getting variables 23826 1726867428.16106: in VariableManager get_vars() 23826 1726867428.16137: Calling all_inventory to load vars for managed_node2 23826 1726867428.16139: Calling groups_inventory to load vars for managed_node2 23826 1726867428.16142: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867428.16149: Calling all_plugins_play to load vars for managed_node2 23826 1726867428.16151: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867428.16153: Calling groups_plugins_play to load vars for managed_node2 23826 1726867428.17254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867428.17870: done with get_vars() 23826 1726867428.17881: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:23:48 -0400 (0:00:00.037) 0:00:10.192 ****** 23826 1726867428.18091: entering _queue_task() for managed_node2/service_facts 23826 1726867428.18093: Creating lock for service_facts 23826 1726867428.18535: worker is 1 (out of 1 available) 23826 1726867428.18547: exiting _queue_task() for managed_node2/service_facts 23826 1726867428.18559: done queuing things up, now waiting for results queue to drain 23826 1726867428.18561: waiting for pending results... 
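[Editor's note] The last task queued in this stretch (set_facts.yml:21) is a service_facts call; unlike the two before it, it actually executes, and the lines that follow show Ansible's standard remote execution sequence for it: detect the remote home directory (echo ~), create a per-task temp directory under ~/.ansible/tmp, push the generated AnsiballZ_service_facts.py payload over SFTP, chmod it, and run it with the remote Python, which streams back the ansible_facts.services dictionary as stdout. A minimal sketch of the task plus a hypothetical consumer of the gathered facts (the consumer is illustrative only and not taken from this log or the role):

    # Assumed shape of the task at set_facts.yml:21
    - name: Check which services are running
      service_facts:

    # Hypothetical follow-up showing how such facts are typically consumed:
    - name: Example - act only if NetworkManager is running
      debug:
        msg: "NetworkManager is active"
      when:
        - "'NetworkManager.service' in ansible_facts.services"
        - "ansible_facts.services['NetworkManager.service'].state == 'running'"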
23826 1726867428.19073: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 23826 1726867428.19081: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000393 23826 1726867428.19085: variable 'ansible_search_path' from source: unknown 23826 1726867428.19094: variable 'ansible_search_path' from source: unknown 23826 1726867428.19138: calling self._execute() 23826 1726867428.19233: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867428.19244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867428.19258: variable 'omit' from source: magic vars 23826 1726867428.19658: variable 'ansible_distribution_major_version' from source: facts 23826 1726867428.19673: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867428.19714: variable 'omit' from source: magic vars 23826 1726867428.19769: variable 'omit' from source: magic vars 23826 1726867428.19820: variable 'omit' from source: magic vars 23826 1726867428.19871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867428.19933: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867428.20053: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867428.20057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867428.20060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867428.20062: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867428.20064: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867428.20066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867428.20146: Set connection var ansible_timeout to 10 23826 1726867428.20166: Set connection var ansible_shell_executable to /bin/sh 23826 1726867428.20173: Set connection var ansible_connection to ssh 23826 1726867428.20192: Set connection var ansible_pipelining to False 23826 1726867428.20199: Set connection var ansible_shell_type to sh 23826 1726867428.20209: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867428.20235: variable 'ansible_shell_executable' from source: unknown 23826 1726867428.20244: variable 'ansible_connection' from source: unknown 23826 1726867428.20252: variable 'ansible_module_compression' from source: unknown 23826 1726867428.20258: variable 'ansible_shell_type' from source: unknown 23826 1726867428.20273: variable 'ansible_shell_executable' from source: unknown 23826 1726867428.20381: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867428.20384: variable 'ansible_pipelining' from source: unknown 23826 1726867428.20386: variable 'ansible_timeout' from source: unknown 23826 1726867428.20389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867428.20516: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 23826 1726867428.20531: variable 'omit' from source: magic vars 23826 
1726867428.20540: starting attempt loop 23826 1726867428.20546: running the handler 23826 1726867428.20562: _low_level_execute_command(): starting 23826 1726867428.20574: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867428.21369: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867428.21395: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867428.21482: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867428.21505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867428.21531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867428.21545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867428.21715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867428.23388: stdout chunk (state=3): >>>/root <<< 23826 1726867428.23526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867428.23732: stdout chunk (state=3): >>><<< 23826 1726867428.23736: stderr chunk (state=3): >>><<< 23826 1726867428.23739: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867428.23742: _low_level_execute_command(): starting 23826 1726867428.23744: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867428.2364857-24415-90622163830183 `" && echo ansible-tmp-1726867428.2364857-24415-90622163830183="` echo /root/.ansible/tmp/ansible-tmp-1726867428.2364857-24415-90622163830183 `" ) && sleep 0' 23826 1726867428.24591: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867428.24687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867428.24703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867428.24736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867428.24750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867428.24775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867428.24845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867428.26962: stdout chunk (state=3): >>>ansible-tmp-1726867428.2364857-24415-90622163830183=/root/.ansible/tmp/ansible-tmp-1726867428.2364857-24415-90622163830183 <<< 23826 1726867428.27084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867428.27088: stdout chunk (state=3): >>><<< 23826 1726867428.27090: stderr chunk (state=3): >>><<< 23826 1726867428.27093: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867428.2364857-24415-90622163830183=/root/.ansible/tmp/ansible-tmp-1726867428.2364857-24415-90622163830183 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867428.27096: variable 
'ansible_module_compression' from source: unknown 23826 1726867428.27325: ANSIBALLZ: Using lock for service_facts 23826 1726867428.27328: ANSIBALLZ: Acquiring lock 23826 1726867428.27330: ANSIBALLZ: Lock acquired: 139851310861392 23826 1726867428.27332: ANSIBALLZ: Creating module 23826 1726867428.42549: ANSIBALLZ: Writing module into payload 23826 1726867428.42648: ANSIBALLZ: Writing module 23826 1726867428.42675: ANSIBALLZ: Renaming module 23826 1726867428.42681: ANSIBALLZ: Done creating module 23826 1726867428.42704: variable 'ansible_facts' from source: unknown 23826 1726867428.42945: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867428.2364857-24415-90622163830183/AnsiballZ_service_facts.py 23826 1726867428.42949: Sending initial data 23826 1726867428.42951: Sent initial data (161 bytes) 23826 1726867428.43550: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867428.43579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867428.43688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867428.43691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867428.43851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867428.45512: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867428.45567: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867428.45685: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmp76ilezfq /root/.ansible/tmp/ansible-tmp-1726867428.2364857-24415-90622163830183/AnsiballZ_service_facts.py <<< 23826 1726867428.45689: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867428.2364857-24415-90622163830183/AnsiballZ_service_facts.py" <<< 23826 1726867428.45731: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmp76ilezfq" to remote "/root/.ansible/tmp/ansible-tmp-1726867428.2364857-24415-90622163830183/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867428.2364857-24415-90622163830183/AnsiballZ_service_facts.py" <<< 23826 1726867428.47475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867428.47481: stdout chunk (state=3): >>><<< 23826 1726867428.47483: stderr chunk (state=3): >>><<< 23826 1726867428.47683: done transferring module to remote 23826 1726867428.47688: _low_level_execute_command(): starting 23826 1726867428.47691: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867428.2364857-24415-90622163830183/ /root/.ansible/tmp/ansible-tmp-1726867428.2364857-24415-90622163830183/AnsiballZ_service_facts.py && sleep 0' 23826 1726867428.48926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867428.48930: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867428.48932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867428.48935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867428.49183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867428.49190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867428.49292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867428.51080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867428.51284: stderr chunk (state=3): >>><<< 23826 1726867428.51287: stdout chunk (state=3): >>><<< 23826 1726867428.51290: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867428.51292: _low_level_execute_command(): starting 23826 1726867428.51295: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867428.2364857-24415-90622163830183/AnsiballZ_service_facts.py && sleep 0' 23826 1726867428.52286: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867428.52295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867428.52311: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867428.52314: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867428.52325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867428.52341: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867428.52353: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 23826 1726867428.52361: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 23826 1726867428.52370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867428.52382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867428.52500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867428.52504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867428.52506: stderr chunk (state=3): >>>debug2: match found <<< 23826 1726867428.52508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867428.52510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867428.52613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867428.52754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867430.12714: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": 
"auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": 
"initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 23826 1726867430.12739: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 23826 1726867430.12778: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state":<<< 23826 1726867430.12788: stdout chunk (state=3): >>> "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": 
"systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": 
"systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source"<<< 23826 1726867430.12798: stdout chunk (state=3): >>>: "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 23826 1726867430.14382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 23826 1726867430.14385: stdout chunk (state=3): >>><<< 23826 1726867430.14582: stderr chunk (state=3): >>><<< 23826 1726867430.14590: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 23826 1726867430.16189: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867428.2364857-24415-90622163830183/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867430.16206: _low_level_execute_command(): starting 23826 1726867430.16220: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867428.2364857-24415-90622163830183/ > /dev/null 2>&1 && sleep 0' 23826 1726867430.16858: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867430.16871: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867430.16890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867430.16938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 23826 1726867430.16949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867430.17017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867430.17046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867430.17067: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867430.18950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867430.18953: stdout chunk (state=3): >>><<< 23826 1726867430.19182: stderr chunk (state=3): >>><<< 23826 1726867430.19189: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867430.19192: handler run complete 23826 1726867430.19194: variable 'ansible_facts' from source: unknown 23826 1726867430.19327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867430.19792: variable 'ansible_facts' from source: unknown 23826 1726867430.19931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867430.20149: attempt loop complete, returning result 23826 1726867430.20161: _execute() done 23826 1726867430.20169: dumping result to json 23826 1726867430.20242: done dumping result, returning 23826 1726867430.20256: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-a92d-a3ea-000000000393] 23826 1726867430.20265: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000393 23826 1726867430.21286: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000393 23826 1726867430.21289: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 23826 1726867430.21391: no more pending results, returning what we have 23826 1726867430.21394: results queue empty 23826 1726867430.21395: checking for any_errors_fatal 23826 1726867430.21397: done checking for any_errors_fatal 23826 1726867430.21398: checking for max_fail_percentage 23826 1726867430.21399: done checking for max_fail_percentage 23826 1726867430.21400: checking to see if all hosts have failed and the running result is not ok 23826 1726867430.21400: done checking to see if all hosts have failed 23826 1726867430.21401: getting the remaining hosts for this loop 23826 1726867430.21402: done getting the remaining hosts for this loop 23826 1726867430.21405: getting the next task for host managed_node2 23826 1726867430.21411: done getting next task for host managed_node2 23826 
1726867430.21414: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 23826 1726867430.21418: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867430.21425: getting variables 23826 1726867430.21426: in VariableManager get_vars() 23826 1726867430.21451: Calling all_inventory to load vars for managed_node2 23826 1726867430.21453: Calling groups_inventory to load vars for managed_node2 23826 1726867430.21455: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867430.21462: Calling all_plugins_play to load vars for managed_node2 23826 1726867430.21465: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867430.21467: Calling groups_plugins_play to load vars for managed_node2 23826 1726867430.21797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867430.22257: done with get_vars() 23826 1726867430.22267: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:23:50 -0400 (0:00:02.042) 0:00:12.234 ****** 23826 1726867430.22356: entering _queue_task() for managed_node2/package_facts 23826 1726867430.22358: Creating lock for package_facts 23826 1726867430.22742: worker is 1 (out of 1 available) 23826 1726867430.22752: exiting _queue_task() for managed_node2/package_facts 23826 1726867430.22763: done queuing things up, now waiting for results queue to drain 23826 1726867430.22764: waiting for pending results... 
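For reference, the two fact-gathering steps traced in this log (the service_facts task that just completed with no_log enabled, and the package_facts task that was just queued) can be reproduced outside the role with a minimal standalone play. The sketch below is illustrative only: the play name and hosts pattern are assumptions, while the task names, the no_log setting, and the structure of ansible_facts.services and ansible_facts.packages are taken directly from the module output recorded in this log.

- name: Reproduce the role's fact-gathering steps (illustrative sketch)
  hosts: managed_node2
  gather_facts: false
  tasks:
    - name: Check which services are running
      ansible.builtin.service_facts:
      no_log: true        # matches the censored result recorded above

    - name: Check which packages are installed
      ansible.builtin.package_facts:

    # Example consumers of the collected facts, using entries visible in this
    # log: services are keyed by unit name (fields: name, state, status,
    # source); packages are keyed by package name, each value a list of dicts
    # with name, version, release, epoch, arch, source.
    - name: Show a service state and a package version
      ansible.builtin.debug:
        msg: >-
          sssd-pam.service is {{ ansible_facts.services['sssd-pam.service'].state }};
          glibc is {{ ansible_facts.packages['glibc'][0].version }}
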
23826 1726867430.23498: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 23826 1726867430.23663: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000394 23826 1726867430.23783: variable 'ansible_search_path' from source: unknown 23826 1726867430.23786: variable 'ansible_search_path' from source: unknown 23826 1726867430.23789: calling self._execute() 23826 1726867430.23968: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867430.23984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867430.24063: variable 'omit' from source: magic vars 23826 1726867430.24507: variable 'ansible_distribution_major_version' from source: facts 23826 1726867430.24519: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867430.24524: variable 'omit' from source: magic vars 23826 1726867430.24578: variable 'omit' from source: magic vars 23826 1726867430.24605: variable 'omit' from source: magic vars 23826 1726867430.24639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867430.24666: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867430.24683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867430.24782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867430.24786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867430.24788: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867430.24792: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867430.24794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867430.24811: Set connection var ansible_timeout to 10 23826 1726867430.24816: Set connection var ansible_shell_executable to /bin/sh 23826 1726867430.24819: Set connection var ansible_connection to ssh 23826 1726867430.24825: Set connection var ansible_pipelining to False 23826 1726867430.24828: Set connection var ansible_shell_type to sh 23826 1726867430.24833: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867430.24851: variable 'ansible_shell_executable' from source: unknown 23826 1726867430.24856: variable 'ansible_connection' from source: unknown 23826 1726867430.24859: variable 'ansible_module_compression' from source: unknown 23826 1726867430.24861: variable 'ansible_shell_type' from source: unknown 23826 1726867430.24863: variable 'ansible_shell_executable' from source: unknown 23826 1726867430.24865: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867430.24867: variable 'ansible_pipelining' from source: unknown 23826 1726867430.24870: variable 'ansible_timeout' from source: unknown 23826 1726867430.24874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867430.25016: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 23826 1726867430.25024: variable 'omit' from source: magic vars 23826 
1726867430.25035: starting attempt loop 23826 1726867430.25039: running the handler 23826 1726867430.25049: _low_level_execute_command(): starting 23826 1726867430.25056: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867430.25542: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867430.25545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867430.25548: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867430.25550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867430.25605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867430.25608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867430.25613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867430.25660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867430.27356: stdout chunk (state=3): >>>/root <<< 23826 1726867430.27465: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867430.27468: stdout chunk (state=3): >>><<< 23826 1726867430.27470: stderr chunk (state=3): >>><<< 23826 1726867430.27497: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867430.27553: _low_level_execute_command(): starting 23826 1726867430.27557: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867430.2750363-24518-59304416433631 `" && echo ansible-tmp-1726867430.2750363-24518-59304416433631="` echo /root/.ansible/tmp/ansible-tmp-1726867430.2750363-24518-59304416433631 `" ) && sleep 0' 23826 1726867430.28133: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867430.28146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867430.28160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867430.28192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867430.28297: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867430.28320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867430.28397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867430.30291: stdout chunk (state=3): >>>ansible-tmp-1726867430.2750363-24518-59304416433631=/root/.ansible/tmp/ansible-tmp-1726867430.2750363-24518-59304416433631 <<< 23826 1726867430.30459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867430.30463: stdout chunk (state=3): >>><<< 23826 1726867430.30465: stderr chunk (state=3): >>><<< 23826 1726867430.30483: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867430.2750363-24518-59304416433631=/root/.ansible/tmp/ansible-tmp-1726867430.2750363-24518-59304416433631 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867430.30554: variable 
'ansible_module_compression' from source: unknown 23826 1726867430.30599: ANSIBALLZ: Using lock for package_facts 23826 1726867430.30660: ANSIBALLZ: Acquiring lock 23826 1726867430.30663: ANSIBALLZ: Lock acquired: 139851305109168 23826 1726867430.30665: ANSIBALLZ: Creating module 23826 1726867430.72614: ANSIBALLZ: Writing module into payload 23826 1726867430.72805: ANSIBALLZ: Writing module 23826 1726867430.72812: ANSIBALLZ: Renaming module 23826 1726867430.72815: ANSIBALLZ: Done creating module 23826 1726867430.72915: variable 'ansible_facts' from source: unknown 23826 1726867430.73031: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867430.2750363-24518-59304416433631/AnsiballZ_package_facts.py 23826 1726867430.73220: Sending initial data 23826 1726867430.73224: Sent initial data (161 bytes) 23826 1726867430.73857: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867430.73860: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867430.73863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867430.73865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867430.73867: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867430.73869: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867430.73871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867430.73874: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867430.73902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867430.73916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867430.73976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867430.74095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867430.75806: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867430.76031: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867430.76182: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmp3sgnvg9o /root/.ansible/tmp/ansible-tmp-1726867430.2750363-24518-59304416433631/AnsiballZ_package_facts.py <<< 23826 1726867430.76215: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867430.2750363-24518-59304416433631/AnsiballZ_package_facts.py" <<< 23826 1726867430.76229: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmp3sgnvg9o" to remote "/root/.ansible/tmp/ansible-tmp-1726867430.2750363-24518-59304416433631/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867430.2750363-24518-59304416433631/AnsiballZ_package_facts.py" <<< 23826 1726867430.78354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867430.78447: stderr chunk (state=3): >>><<< 23826 1726867430.78456: stdout chunk (state=3): >>><<< 23826 1726867430.78497: done transferring module to remote 23826 1726867430.78506: _low_level_execute_command(): starting 23826 1726867430.78511: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867430.2750363-24518-59304416433631/ /root/.ansible/tmp/ansible-tmp-1726867430.2750363-24518-59304416433631/AnsiballZ_package_facts.py && sleep 0' 23826 1726867430.78916: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867430.78920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867430.78937: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867430.78987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867430.78991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867430.79038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867430.80958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867430.80966: stderr chunk (state=3): >>><<< 23826 1726867430.80968: stdout chunk (state=3): >>><<< 23826 1726867430.80991: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867430.80994: _low_level_execute_command(): starting 23826 1726867430.80997: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867430.2750363-24518-59304416433631/AnsiballZ_package_facts.py && sleep 0' 23826 1726867430.81594: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867430.81602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867430.81713: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867430.81717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867430.81720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867430.81793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867431.26288: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": 
"google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": 
"popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 23826 1726867431.26317: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": 
"0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 23826 1726867431.26326: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": 
"hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": 
"memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": 
"102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 23826 1726867431.26347: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": 
"10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 23826 1726867431.26403: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": 
"perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", 
"version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 23826 1726867431.26445: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 23826 1726867431.28251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 23826 1726867431.28254: stdout chunk (state=3): >>><<< 23826 1726867431.28256: stderr chunk (state=3): >>><<< 23826 1726867431.28489: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 23826 1726867431.31169: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867430.2750363-24518-59304416433631/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867431.31201: _low_level_execute_command(): starting 23826 1726867431.31213: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867430.2750363-24518-59304416433631/ > /dev/null 2>&1 && sleep 0' 23826 1726867431.31866: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867431.31991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867431.32014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867431.32089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867431.33992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867431.34010: stdout chunk (state=3): >>><<< 23826 1726867431.34062: stderr chunk (state=3): >>><<< 23826 1726867431.34066: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867431.34068: handler run complete 23826 1726867431.34579: variable 'ansible_facts' from source: unknown 23826 1726867431.34820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867431.35997: variable 'ansible_facts' from source: unknown 23826 1726867431.39354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867431.39987: attempt loop complete, returning result 23826 1726867431.39997: _execute() done 23826 1726867431.40000: dumping result to json 23826 1726867431.40283: done dumping result, returning 23826 1726867431.40287: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-a92d-a3ea-000000000394] 23826 1726867431.40289: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000394 23826 1726867431.41558: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000394 23826 1726867431.41562: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 23826 1726867431.41606: no more pending results, returning what we have 23826 1726867431.41611: results queue empty 23826 1726867431.41611: checking for any_errors_fatal 23826 1726867431.41617: done checking for any_errors_fatal 23826 1726867431.41618: checking for max_fail_percentage 23826 1726867431.41619: done checking for max_fail_percentage 23826 1726867431.41620: checking to see if all hosts have failed and the running result is not ok 23826 1726867431.41621: done checking to see if all hosts have failed 23826 1726867431.41621: getting the remaining hosts for this loop 23826 1726867431.41622: done getting the remaining hosts for this loop 23826 1726867431.41625: getting the next task for host managed_node2 23826 1726867431.41629: done getting next task for host managed_node2 23826 1726867431.41631: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 23826 1726867431.41634: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867431.41641: getting variables 23826 1726867431.41642: in VariableManager get_vars() 23826 1726867431.41665: Calling all_inventory to load vars for managed_node2 23826 1726867431.41667: Calling groups_inventory to load vars for managed_node2 23826 1726867431.41668: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867431.41675: Calling all_plugins_play to load vars for managed_node2 23826 1726867431.41676: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867431.41680: Calling groups_plugins_play to load vars for managed_node2 23826 1726867431.42685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867431.44075: done with get_vars() 23826 1726867431.44093: done getting variables 23826 1726867431.44138: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:23:51 -0400 (0:00:01.218) 0:00:13.452 ****** 23826 1726867431.44163: entering _queue_task() for managed_node2/debug 23826 1726867431.44391: worker is 1 (out of 1 available) 23826 1726867431.44405: exiting _queue_task() for managed_node2/debug 23826 1726867431.44419: done queuing things up, now waiting for results queue to drain 23826 1726867431.44420: waiting for pending results... 23826 1726867431.44591: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 23826 1726867431.44669: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000018 23826 1726867431.44685: variable 'ansible_search_path' from source: unknown 23826 1726867431.44689: variable 'ansible_search_path' from source: unknown 23826 1726867431.44726: calling self._execute() 23826 1726867431.44793: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867431.44799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867431.44810: variable 'omit' from source: magic vars 23826 1726867431.45078: variable 'ansible_distribution_major_version' from source: facts 23826 1726867431.45089: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867431.45093: variable 'omit' from source: magic vars 23826 1726867431.45127: variable 'omit' from source: magic vars 23826 1726867431.45199: variable 'network_provider' from source: set_fact 23826 1726867431.45212: variable 'omit' from source: magic vars 23826 1726867431.45243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867431.45271: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867431.45288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867431.45309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867431.45315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 
1726867431.45338: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867431.45341: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867431.45344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867431.45418: Set connection var ansible_timeout to 10 23826 1726867431.45421: Set connection var ansible_shell_executable to /bin/sh 23826 1726867431.45424: Set connection var ansible_connection to ssh 23826 1726867431.45426: Set connection var ansible_pipelining to False 23826 1726867431.45430: Set connection var ansible_shell_type to sh 23826 1726867431.45435: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867431.45453: variable 'ansible_shell_executable' from source: unknown 23826 1726867431.45456: variable 'ansible_connection' from source: unknown 23826 1726867431.45458: variable 'ansible_module_compression' from source: unknown 23826 1726867431.45461: variable 'ansible_shell_type' from source: unknown 23826 1726867431.45463: variable 'ansible_shell_executable' from source: unknown 23826 1726867431.45465: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867431.45467: variable 'ansible_pipelining' from source: unknown 23826 1726867431.45470: variable 'ansible_timeout' from source: unknown 23826 1726867431.45475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867431.45734: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867431.45737: variable 'omit' from source: magic vars 23826 1726867431.45740: starting attempt loop 23826 1726867431.45742: running the handler 23826 1726867431.45744: handler run complete 23826 1726867431.45747: attempt loop complete, returning result 23826 1726867431.45749: _execute() done 23826 1726867431.45751: dumping result to json 23826 1726867431.45753: done dumping result, returning 23826 1726867431.45755: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-a92d-a3ea-000000000018] 23826 1726867431.45758: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000018 23826 1726867431.45816: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000018 23826 1726867431.45819: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 23826 1726867431.45941: no more pending results, returning what we have 23826 1726867431.45945: results queue empty 23826 1726867431.45946: checking for any_errors_fatal 23826 1726867431.45955: done checking for any_errors_fatal 23826 1726867431.45956: checking for max_fail_percentage 23826 1726867431.45957: done checking for max_fail_percentage 23826 1726867431.45958: checking to see if all hosts have failed and the running result is not ok 23826 1726867431.45960: done checking to see if all hosts have failed 23826 1726867431.45961: getting the remaining hosts for this loop 23826 1726867431.45962: done getting the remaining hosts for this loop 23826 1726867431.45966: getting the next task for host managed_node2 23826 1726867431.45973: done getting next task for host managed_node2 23826 1726867431.45979: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 23826 1726867431.45983: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867431.46169: getting variables 23826 1726867431.46171: in VariableManager get_vars() 23826 1726867431.46206: Calling all_inventory to load vars for managed_node2 23826 1726867431.46211: Calling groups_inventory to load vars for managed_node2 23826 1726867431.46214: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867431.46221: Calling all_plugins_play to load vars for managed_node2 23826 1726867431.46223: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867431.46226: Calling groups_plugins_play to load vars for managed_node2 23826 1726867431.47125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867431.47988: done with get_vars() 23826 1726867431.48002: done getting variables 23826 1726867431.48045: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:23:51 -0400 (0:00:00.039) 0:00:13.492 ****** 23826 1726867431.48068: entering _queue_task() for managed_node2/fail 23826 1726867431.48252: worker is 1 (out of 1 available) 23826 1726867431.48265: exiting _queue_task() for managed_node2/fail 23826 1726867431.48276: done queuing things up, now waiting for results queue to drain 23826 1726867431.48278: waiting for pending results... 
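The task queued above (main.yml:11) is a fail action guarded by a when clause; the skip result that follows reports network_state != {} as the false condition, because network_state is still the role's empty default on this run. A hedged sketch of such a guard; the message text and the second condition are assumptions drawn only from the task name, not from the role source:

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: "Applying network_state is not supported with the initscripts provider"   # illustrative wording, not from the log
      when:
        - network_state != {}                  # reported as false_condition in the skip result below
        - network_provider == "initscripts"    # assumed from the task name only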
23826 1726867431.48438: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 23826 1726867431.48518: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000019 23826 1726867431.48530: variable 'ansible_search_path' from source: unknown 23826 1726867431.48534: variable 'ansible_search_path' from source: unknown 23826 1726867431.48561: calling self._execute() 23826 1726867431.48630: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867431.48636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867431.48644: variable 'omit' from source: magic vars 23826 1726867431.48911: variable 'ansible_distribution_major_version' from source: facts 23826 1726867431.48918: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867431.49000: variable 'network_state' from source: role '' defaults 23826 1726867431.49011: Evaluated conditional (network_state != {}): False 23826 1726867431.49014: when evaluation is False, skipping this task 23826 1726867431.49018: _execute() done 23826 1726867431.49020: dumping result to json 23826 1726867431.49023: done dumping result, returning 23826 1726867431.49026: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-a92d-a3ea-000000000019] 23826 1726867431.49031: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000019 23826 1726867431.49114: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000019 23826 1726867431.49116: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 23826 1726867431.49193: no more pending results, returning what we have 23826 1726867431.49196: results queue empty 23826 1726867431.49197: checking for any_errors_fatal 23826 1726867431.49202: done checking for any_errors_fatal 23826 1726867431.49202: checking for max_fail_percentage 23826 1726867431.49204: done checking for max_fail_percentage 23826 1726867431.49204: checking to see if all hosts have failed and the running result is not ok 23826 1726867431.49205: done checking to see if all hosts have failed 23826 1726867431.49206: getting the remaining hosts for this loop 23826 1726867431.49210: done getting the remaining hosts for this loop 23826 1726867431.49212: getting the next task for host managed_node2 23826 1726867431.49217: done getting next task for host managed_node2 23826 1726867431.49221: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 23826 1726867431.49223: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867431.49237: getting variables 23826 1726867431.49238: in VariableManager get_vars() 23826 1726867431.49264: Calling all_inventory to load vars for managed_node2 23826 1726867431.49266: Calling groups_inventory to load vars for managed_node2 23826 1726867431.49267: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867431.49273: Calling all_plugins_play to load vars for managed_node2 23826 1726867431.49275: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867431.49276: Calling groups_plugins_play to load vars for managed_node2 23826 1726867431.50106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867431.50957: done with get_vars() 23826 1726867431.50971: done getting variables 23826 1726867431.51017: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:23:51 -0400 (0:00:00.029) 0:00:13.521 ****** 23826 1726867431.51039: entering _queue_task() for managed_node2/fail 23826 1726867431.51235: worker is 1 (out of 1 available) 23826 1726867431.51249: exiting _queue_task() for managed_node2/fail 23826 1726867431.51262: done queuing things up, now waiting for results queue to drain 23826 1726867431.51263: waiting for pending results... 
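The next abort task (main.yml:18) follows the same pattern, and its skip result again names network_state != {} as the false condition. That is informative: this host reports a distribution major version greater than 9 (the EL10 check later in this log evaluates "| int > 9" as True), so a version-below-8 comparison would also be false; the fact that network_state != {} is the condition reported suggests it is evaluated first and the version comparison is never reached. A sketch under that assumption:

    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      ansible.builtin.fail:
        msg: "Applying network_state requires a managed host running version 8 or later"   # illustrative wording
      when:
        - network_state != {}                              # reported as false_condition below
        - ansible_distribution_major_version | int < 8     # assumed from the task name; not reached on this run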
23826 1726867431.51420: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 23826 1726867431.51497: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000001a 23826 1726867431.51511: variable 'ansible_search_path' from source: unknown 23826 1726867431.51515: variable 'ansible_search_path' from source: unknown 23826 1726867431.51540: calling self._execute() 23826 1726867431.51605: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867431.51613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867431.51619: variable 'omit' from source: magic vars 23826 1726867431.51878: variable 'ansible_distribution_major_version' from source: facts 23826 1726867431.51888: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867431.51968: variable 'network_state' from source: role '' defaults 23826 1726867431.51976: Evaluated conditional (network_state != {}): False 23826 1726867431.51980: when evaluation is False, skipping this task 23826 1726867431.51983: _execute() done 23826 1726867431.51986: dumping result to json 23826 1726867431.51988: done dumping result, returning 23826 1726867431.51996: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-a92d-a3ea-00000000001a] 23826 1726867431.51999: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000001a skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 23826 1726867431.52133: no more pending results, returning what we have 23826 1726867431.52136: results queue empty 23826 1726867431.52137: checking for any_errors_fatal 23826 1726867431.52143: done checking for any_errors_fatal 23826 1726867431.52143: checking for max_fail_percentage 23826 1726867431.52145: done checking for max_fail_percentage 23826 1726867431.52146: checking to see if all hosts have failed and the running result is not ok 23826 1726867431.52147: done checking to see if all hosts have failed 23826 1726867431.52147: getting the remaining hosts for this loop 23826 1726867431.52149: done getting the remaining hosts for this loop 23826 1726867431.52152: getting the next task for host managed_node2 23826 1726867431.52158: done getting next task for host managed_node2 23826 1726867431.52161: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 23826 1726867431.52164: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867431.52179: getting variables 23826 1726867431.52180: in VariableManager get_vars() 23826 1726867431.52212: Calling all_inventory to load vars for managed_node2 23826 1726867431.52215: Calling groups_inventory to load vars for managed_node2 23826 1726867431.52217: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867431.52224: Calling all_plugins_play to load vars for managed_node2 23826 1726867431.52226: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867431.52229: Calling groups_plugins_play to load vars for managed_node2 23826 1726867431.52237: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000001a 23826 1726867431.52241: WORKER PROCESS EXITING 23826 1726867431.52960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867431.53817: done with get_vars() 23826 1726867431.53832: done getting variables 23826 1726867431.53873: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:23:51 -0400 (0:00:00.028) 0:00:13.550 ****** 23826 1726867431.53897: entering _queue_task() for managed_node2/fail 23826 1726867431.54089: worker is 1 (out of 1 available) 23826 1726867431.54102: exiting _queue_task() for managed_node2/fail 23826 1726867431.54116: done queuing things up, now waiting for results queue to drain 23826 1726867431.54118: waiting for pending results... 
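For the teaming abort queued above (main.yml:25), the skip result further down quotes the guard verbatim, so the conditional itself is not guessed here; only the fail message and the surrounding task layout are sketched:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: "Team interfaces are not supported on EL10 and later"   # illustrative wording
      when:
        - ansible_distribution_major_version | int > 9    # evaluated True in this run
        - ansible_distribution in __network_rh_distros    # evaluated True in this run
        - network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0

The selectattr("type", "defined") step keeps only entries that actually have a type key, so the subsequent regex match never touches an undefined attribute; on this run no team entries exist and the whole expression evaluates False, which is why the task is skipped.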
23826 1726867431.54268: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 23826 1726867431.54352: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000001b 23826 1726867431.54363: variable 'ansible_search_path' from source: unknown 23826 1726867431.54366: variable 'ansible_search_path' from source: unknown 23826 1726867431.54396: calling self._execute() 23826 1726867431.54457: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867431.54466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867431.54474: variable 'omit' from source: magic vars 23826 1726867431.54740: variable 'ansible_distribution_major_version' from source: facts 23826 1726867431.54749: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867431.54870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867431.56565: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867431.56608: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867431.56638: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867431.56673: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867431.56694: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867431.56755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867431.56775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867431.56794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.56823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867431.56834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867431.56900: variable 'ansible_distribution_major_version' from source: facts 23826 1726867431.56916: Evaluated conditional (ansible_distribution_major_version | int > 9): True 23826 1726867431.56994: variable 'ansible_distribution' from source: facts 23826 1726867431.56997: variable '__network_rh_distros' from source: role '' defaults 23826 1726867431.57005: Evaluated conditional (ansible_distribution in __network_rh_distros): True 23826 1726867431.57158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867431.57181: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867431.57198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.57225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867431.57237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867431.57270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867431.57291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867431.57307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.57333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867431.57343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867431.57370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867431.57394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867431.57409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.57436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867431.57446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867431.57633: variable 'network_connections' from source: task vars 23826 1726867431.57641: variable 'interface' from source: set_fact 23826 1726867431.57691: variable 'interface' from source: set_fact 23826 1726867431.57699: variable 'interface' from source: set_fact 23826 1726867431.57746: variable 'interface' from source: set_fact 23826 1726867431.57753: variable 'network_state' from source: role '' defaults 23826 
1726867431.57799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867431.57906: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867431.57937: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867431.57958: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867431.57980: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867431.58021: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867431.58045: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867431.58062: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.58081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867431.58108: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 23826 1726867431.58111: when evaluation is False, skipping this task 23826 1726867431.58115: _execute() done 23826 1726867431.58118: dumping result to json 23826 1726867431.58120: done dumping result, returning 23826 1726867431.58128: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-a92d-a3ea-00000000001b] 23826 1726867431.58133: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000001b 23826 1726867431.58210: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000001b 23826 1726867431.58213: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 23826 1726867431.58297: no more pending results, returning what we have 23826 1726867431.58300: results queue empty 23826 1726867431.58301: checking for any_errors_fatal 23826 1726867431.58305: done checking for any_errors_fatal 23826 1726867431.58306: checking for max_fail_percentage 23826 1726867431.58308: done checking for max_fail_percentage 23826 1726867431.58308: checking to see if all hosts have failed and the running result is not ok 23826 1726867431.58309: done checking to see if all hosts have failed 23826 1726867431.58310: getting the remaining hosts for this loop 23826 1726867431.58311: done getting the remaining hosts for this loop 23826 1726867431.58315: getting the next 
task for host managed_node2 23826 1726867431.58321: done getting next task for host managed_node2 23826 1726867431.58324: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 23826 1726867431.58327: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867431.58339: getting variables 23826 1726867431.58340: in VariableManager get_vars() 23826 1726867431.58370: Calling all_inventory to load vars for managed_node2 23826 1726867431.58373: Calling groups_inventory to load vars for managed_node2 23826 1726867431.58375: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867431.58384: Calling all_plugins_play to load vars for managed_node2 23826 1726867431.58386: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867431.58389: Calling groups_plugins_play to load vars for managed_node2 23826 1726867431.59222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867431.60063: done with get_vars() 23826 1726867431.60081: done getting variables 23826 1726867431.60146: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:23:51 -0400 (0:00:00.062) 0:00:13.613 ****** 23826 1726867431.60167: entering _queue_task() for managed_node2/dnf 23826 1726867431.60367: worker is 1 (out of 1 available) 23826 1726867431.60382: exiting _queue_task() for managed_node2/dnf 23826 1726867431.60394: done queuing things up, now waiting for results queue to drain 23826 1726867431.60395: waiting for pending results... 
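The DNF check queued above (main.yml:36) is skipped further down because neither wireless nor team connections are defined. The two conditions below are taken from the evaluations recorded in the log; the module arguments are assumptions about how such an "are updates available" probe could be written, not the role's actual source:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"        # assumed to reuse the role's package list
        state: latest
      check_mode: true                        # assumed: probe for updates without changing anything
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7   # evaluated True in this run
        - __network_wireless_connections_defined or __network_team_connections_defined       # evaluated False, so the task is skipped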
23826 1726867431.60556: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 23826 1726867431.60642: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000001c 23826 1726867431.60655: variable 'ansible_search_path' from source: unknown 23826 1726867431.60659: variable 'ansible_search_path' from source: unknown 23826 1726867431.60688: calling self._execute() 23826 1726867431.60757: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867431.60761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867431.60770: variable 'omit' from source: magic vars 23826 1726867431.61028: variable 'ansible_distribution_major_version' from source: facts 23826 1726867431.61037: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867431.61165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867431.62662: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867431.62692: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867431.62723: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867431.62748: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867431.62768: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867431.62831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867431.62850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867431.62869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.62896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867431.62909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867431.62989: variable 'ansible_distribution' from source: facts 23826 1726867431.62993: variable 'ansible_distribution_major_version' from source: facts 23826 1726867431.63005: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 23826 1726867431.63082: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867431.63168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867431.63186: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867431.63202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.63231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867431.63244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867431.63271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867431.63288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867431.63304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.63331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867431.63343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867431.63371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867431.63389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867431.63404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.63431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867431.63441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867431.63542: variable 'network_connections' from source: task vars 23826 1726867431.63551: variable 'interface' from source: set_fact 23826 1726867431.63601: variable 'interface' from source: set_fact 23826 1726867431.63608: variable 'interface' from source: set_fact 23826 1726867431.63653: variable 'interface' from source: set_fact 23826 1726867431.63702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 23826 1726867431.63810: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867431.63837: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867431.63870: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867431.63895: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867431.63927: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867431.63943: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867431.63964: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.63982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867431.64028: variable '__network_team_connections_defined' from source: role '' defaults 23826 1726867431.64282: variable 'network_connections' from source: task vars 23826 1726867431.64285: variable 'interface' from source: set_fact 23826 1726867431.64287: variable 'interface' from source: set_fact 23826 1726867431.64290: variable 'interface' from source: set_fact 23826 1726867431.64348: variable 'interface' from source: set_fact 23826 1726867431.64383: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 23826 1726867431.64392: when evaluation is False, skipping this task 23826 1726867431.64399: _execute() done 23826 1726867431.64406: dumping result to json 23826 1726867431.64418: done dumping result, returning 23826 1726867431.64431: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-a92d-a3ea-00000000001c] 23826 1726867431.64440: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000001c skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 23826 1726867431.64586: no more pending results, returning what we have 23826 1726867431.64589: results queue empty 23826 1726867431.64590: checking for any_errors_fatal 23826 1726867431.64595: done checking for any_errors_fatal 23826 1726867431.64595: checking for max_fail_percentage 23826 1726867431.64597: done checking for max_fail_percentage 23826 1726867431.64598: checking to see if all hosts have failed and the running result is not ok 23826 1726867431.64599: done checking to see if all hosts have failed 23826 1726867431.64599: getting the remaining hosts for this loop 23826 1726867431.64601: done getting the remaining hosts for this loop 23826 1726867431.64604: getting the next task for host managed_node2 23826 1726867431.64611: done getting next task for host managed_node2 23826 
1726867431.64614: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 23826 1726867431.64617: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867431.64785: getting variables 23826 1726867431.64787: in VariableManager get_vars() 23826 1726867431.64820: Calling all_inventory to load vars for managed_node2 23826 1726867431.64823: Calling groups_inventory to load vars for managed_node2 23826 1726867431.64825: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867431.64833: Calling all_plugins_play to load vars for managed_node2 23826 1726867431.64836: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867431.64839: Calling groups_plugins_play to load vars for managed_node2 23826 1726867431.65357: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000001c 23826 1726867431.65361: WORKER PROCESS EXITING 23826 1726867431.65855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867431.66806: done with get_vars() 23826 1726867431.66823: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 23826 1726867431.66875: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:23:51 -0400 (0:00:00.067) 0:00:13.680 ****** 23826 1726867431.66897: entering _queue_task() for managed_node2/yum 23826 1726867431.66899: Creating lock for yum 23826 1726867431.67111: worker is 1 (out of 1 available) 23826 1726867431.67128: exiting _queue_task() for managed_node2/yum 23826 1726867431.67140: done queuing things up, now waiting for results queue to drain 23826 1726867431.67141: waiting for pending results... 
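The matching YUM variant queued above (main.yml:48) covers hosts older than version 8; ansible-core transparently redirects ansible.builtin.yum to ansible.builtin.dnf, as the "redirecting (type: action)" line shows, and on this host the version guard is false so the task is skipped. Sketched under the same assumptions as the DNF variant:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:                    # redirected to ansible.builtin.dnf by ansible-core
        name: "{{ network_packages }}"        # assumed
        state: latest
      check_mode: true                        # assumed
      when:
        - ansible_distribution_major_version | int < 8       # evaluated False in this run
        - __network_wireless_connections_defined or __network_team_connections_defined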
23826 1726867431.67335: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 23826 1726867431.67467: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000001d 23826 1726867431.67489: variable 'ansible_search_path' from source: unknown 23826 1726867431.67496: variable 'ansible_search_path' from source: unknown 23826 1726867431.67539: calling self._execute() 23826 1726867431.67633: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867431.67646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867431.67663: variable 'omit' from source: magic vars 23826 1726867431.68033: variable 'ansible_distribution_major_version' from source: facts 23826 1726867431.68048: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867431.68230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867431.70586: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867431.70674: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867431.70727: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867431.70766: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867431.70798: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867431.70890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867431.70927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867431.70965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.71015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867431.71036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867431.71149: variable 'ansible_distribution_major_version' from source: facts 23826 1726867431.71181: Evaluated conditional (ansible_distribution_major_version | int < 8): False 23826 1726867431.71270: when evaluation is False, skipping this task 23826 1726867431.71273: _execute() done 23826 1726867431.71276: dumping result to json 23826 1726867431.71282: done dumping result, returning 23826 1726867431.71285: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-a92d-a3ea-00000000001d] 23826 
1726867431.71287: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000001d 23826 1726867431.71363: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000001d 23826 1726867431.71366: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 23826 1726867431.71425: no more pending results, returning what we have 23826 1726867431.71429: results queue empty 23826 1726867431.71430: checking for any_errors_fatal 23826 1726867431.71435: done checking for any_errors_fatal 23826 1726867431.71436: checking for max_fail_percentage 23826 1726867431.71438: done checking for max_fail_percentage 23826 1726867431.71438: checking to see if all hosts have failed and the running result is not ok 23826 1726867431.71439: done checking to see if all hosts have failed 23826 1726867431.71440: getting the remaining hosts for this loop 23826 1726867431.71442: done getting the remaining hosts for this loop 23826 1726867431.71445: getting the next task for host managed_node2 23826 1726867431.71453: done getting next task for host managed_node2 23826 1726867431.71456: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 23826 1726867431.71460: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867431.71475: getting variables 23826 1726867431.71479: in VariableManager get_vars() 23826 1726867431.71521: Calling all_inventory to load vars for managed_node2 23826 1726867431.71523: Calling groups_inventory to load vars for managed_node2 23826 1726867431.71526: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867431.71535: Calling all_plugins_play to load vars for managed_node2 23826 1726867431.71538: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867431.71541: Calling groups_plugins_play to load vars for managed_node2 23826 1726867431.73243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867431.74954: done with get_vars() 23826 1726867431.74984: done getting variables 23826 1726867431.75171: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:23:51 -0400 (0:00:00.083) 0:00:13.763 ****** 23826 1726867431.75215: entering _queue_task() for managed_node2/fail 23826 1726867431.75817: worker is 1 (out of 1 available) 23826 1726867431.75830: exiting _queue_task() for managed_node2/fail 23826 1726867431.75842: done queuing things up, now waiting for results queue to drain 23826 1726867431.75843: waiting for pending results... 
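Despite its name, the consent task queued above (main.yml:60) loads the fail action module, so it aborts rather than prompts when its guard holds; on this run the guard (wireless or team connections defined) is false and it is skipped. A sketch, with the message and any consent flag left as explicit assumptions since they are not visible in this log:

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: "Wireless or team interfaces require restarting NetworkManager; set an explicit consent variable to proceed"   # illustrative wording
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined   # evaluated False, so the task is skipped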
23826 1726867431.76084: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 23826 1726867431.76387: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000001e 23826 1726867431.76391: variable 'ansible_search_path' from source: unknown 23826 1726867431.76396: variable 'ansible_search_path' from source: unknown 23826 1726867431.76440: calling self._execute() 23826 1726867431.76540: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867431.76553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867431.76568: variable 'omit' from source: magic vars 23826 1726867431.76954: variable 'ansible_distribution_major_version' from source: facts 23826 1726867431.77017: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867431.77105: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867431.77319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867431.79449: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867431.79502: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867431.79531: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867431.79555: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867431.79576: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867431.79634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867431.79656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867431.79674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.79704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867431.79715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867431.79750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867431.79766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867431.79786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.79813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867431.79821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867431.79850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867431.79868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867431.79887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.79918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867431.79928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867431.80044: variable 'network_connections' from source: task vars 23826 1726867431.80055: variable 'interface' from source: set_fact 23826 1726867431.80109: variable 'interface' from source: set_fact 23826 1726867431.80119: variable 'interface' from source: set_fact 23826 1726867431.80161: variable 'interface' from source: set_fact 23826 1726867431.80210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867431.80320: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867431.80346: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867431.80379: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867431.80403: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867431.80436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867431.80454: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867431.80472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.80492: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867431.80540: 
variable '__network_team_connections_defined' from source: role '' defaults 23826 1726867431.80702: variable 'network_connections' from source: task vars 23826 1726867431.80706: variable 'interface' from source: set_fact 23826 1726867431.80754: variable 'interface' from source: set_fact 23826 1726867431.80759: variable 'interface' from source: set_fact 23826 1726867431.80876: variable 'interface' from source: set_fact 23826 1726867431.80881: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 23826 1726867431.80884: when evaluation is False, skipping this task 23826 1726867431.80984: _execute() done 23826 1726867431.80987: dumping result to json 23826 1726867431.80989: done dumping result, returning 23826 1726867431.80991: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-a92d-a3ea-00000000001e] 23826 1726867431.81000: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000001e 23826 1726867431.81058: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000001e 23826 1726867431.81060: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 23826 1726867431.81126: no more pending results, returning what we have 23826 1726867431.81130: results queue empty 23826 1726867431.81131: checking for any_errors_fatal 23826 1726867431.81137: done checking for any_errors_fatal 23826 1726867431.81138: checking for max_fail_percentage 23826 1726867431.81140: done checking for max_fail_percentage 23826 1726867431.81141: checking to see if all hosts have failed and the running result is not ok 23826 1726867431.81142: done checking to see if all hosts have failed 23826 1726867431.81143: getting the remaining hosts for this loop 23826 1726867431.81144: done getting the remaining hosts for this loop 23826 1726867431.81148: getting the next task for host managed_node2 23826 1726867431.81155: done getting next task for host managed_node2 23826 1726867431.81159: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 23826 1726867431.81162: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867431.81176: getting variables 23826 1726867431.81180: in VariableManager get_vars() 23826 1726867431.81220: Calling all_inventory to load vars for managed_node2 23826 1726867431.81223: Calling groups_inventory to load vars for managed_node2 23826 1726867431.81226: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867431.81235: Calling all_plugins_play to load vars for managed_node2 23826 1726867431.81238: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867431.81240: Calling groups_plugins_play to load vars for managed_node2 23826 1726867431.82440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867431.86103: done with get_vars() 23826 1726867431.86118: done getting variables 23826 1726867431.86149: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:23:51 -0400 (0:00:00.109) 0:00:13.873 ****** 23826 1726867431.86170: entering _queue_task() for managed_node2/package 23826 1726867431.86389: worker is 1 (out of 1 available) 23826 1726867431.86403: exiting _queue_task() for managed_node2/package 23826 1726867431.86413: done queuing things up, now waiting for results queue to drain 23826 1726867431.86415: waiting for pending results... 23826 1726867431.86590: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 23826 1726867431.86705: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000001f 23826 1726867431.86718: variable 'ansible_search_path' from source: unknown 23826 1726867431.86731: variable 'ansible_search_path' from source: unknown 23826 1726867431.86819: calling self._execute() 23826 1726867431.86869: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867431.86880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867431.86888: variable 'omit' from source: magic vars 23826 1726867431.87382: variable 'ansible_distribution_major_version' from source: facts 23826 1726867431.87386: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867431.87448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867431.87782: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867431.87785: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867431.87796: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867431.87837: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867431.87980: variable 'network_packages' from source: role '' defaults 23826 1726867431.88096: variable '__network_provider_setup' from source: role '' defaults 23826 1726867431.88114: variable '__network_service_name_default_nm' from source: role '' defaults 23826 1726867431.88195: variable 
'__network_service_name_default_nm' from source: role '' defaults 23826 1726867431.88211: variable '__network_packages_default_nm' from source: role '' defaults 23826 1726867431.88272: variable '__network_packages_default_nm' from source: role '' defaults 23826 1726867431.88475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867431.89857: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867431.89897: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867431.89939: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867431.89964: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867431.89985: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867431.90042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867431.90063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867431.90081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.90107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867431.90120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867431.90154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867431.90170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867431.90190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.90217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867431.90228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867431.90367: variable '__network_packages_default_gobject_packages' from source: role '' defaults 23826 1726867431.90438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867431.90457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867431.90504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.90682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867431.90685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867431.90688: variable 'ansible_python' from source: facts 23826 1726867431.90689: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 23826 1726867431.90761: variable '__network_wpa_supplicant_required' from source: role '' defaults 23826 1726867431.90855: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 23826 1726867431.90986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867431.91015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867431.91054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.91097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867431.91116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867431.91175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867431.91245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867431.91248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.91293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867431.91313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867431.91482: variable 'network_connections' from source: task vars 23826 1726867431.91510: variable 'interface' from source: set_fact 23826 1726867431.91569: variable 'interface' from source: set_fact 23826 1726867431.91574: variable 'interface' from source: set_fact 23826 1726867431.91646: variable 'interface' from source: set_fact 23826 1726867431.91699: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867431.91727: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867431.91747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867431.91768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867431.91806: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867431.91984: variable 'network_connections' from source: task vars 23826 1726867431.91988: variable 'interface' from source: set_fact 23826 1726867431.92061: variable 'interface' from source: set_fact 23826 1726867431.92068: variable 'interface' from source: set_fact 23826 1726867431.92141: variable 'interface' from source: set_fact 23826 1726867431.92175: variable '__network_packages_default_wireless' from source: role '' defaults 23826 1726867431.92238: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867431.92430: variable 'network_connections' from source: task vars 23826 1726867431.92433: variable 'interface' from source: set_fact 23826 1726867431.92482: variable 'interface' from source: set_fact 23826 1726867431.92487: variable 'interface' from source: set_fact 23826 1726867431.92533: variable 'interface' from source: set_fact 23826 1726867431.92551: variable '__network_packages_default_team' from source: role '' defaults 23826 1726867431.92607: variable '__network_team_connections_defined' from source: role '' defaults 23826 1726867431.92801: variable 'network_connections' from source: task vars 23826 1726867431.92804: variable 'interface' from source: set_fact 23826 1726867431.92851: variable 'interface' from source: set_fact 23826 1726867431.92858: variable 'interface' from source: set_fact 23826 1726867431.92907: variable 'interface' from source: set_fact 23826 1726867431.92950: variable '__network_service_name_default_initscripts' from source: role '' defaults 23826 1726867431.92994: variable '__network_service_name_default_initscripts' from source: role '' defaults 23826 1726867431.93000: variable '__network_packages_default_initscripts' from source: role '' defaults 23826 1726867431.93044: variable '__network_packages_default_initscripts' from source: role '' defaults 23826 1726867431.93187: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 23826 1726867431.93484: variable 'network_connections' from source: task vars 23826 1726867431.93488: variable 'interface' from source: set_fact 23826 
1726867431.93535: variable 'interface' from source: set_fact 23826 1726867431.93538: variable 'interface' from source: set_fact 23826 1726867431.93581: variable 'interface' from source: set_fact 23826 1726867431.93588: variable 'ansible_distribution' from source: facts 23826 1726867431.93591: variable '__network_rh_distros' from source: role '' defaults 23826 1726867431.93597: variable 'ansible_distribution_major_version' from source: facts 23826 1726867431.93616: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 23826 1726867431.93724: variable 'ansible_distribution' from source: facts 23826 1726867431.93728: variable '__network_rh_distros' from source: role '' defaults 23826 1726867431.93731: variable 'ansible_distribution_major_version' from source: facts 23826 1726867431.93745: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 23826 1726867431.93850: variable 'ansible_distribution' from source: facts 23826 1726867431.93855: variable '__network_rh_distros' from source: role '' defaults 23826 1726867431.93858: variable 'ansible_distribution_major_version' from source: facts 23826 1726867431.93884: variable 'network_provider' from source: set_fact 23826 1726867431.93896: variable 'ansible_facts' from source: unknown 23826 1726867431.94320: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 23826 1726867431.94324: when evaluation is False, skipping this task 23826 1726867431.94326: _execute() done 23826 1726867431.94329: dumping result to json 23826 1726867431.94330: done dumping result, returning 23826 1726867431.94337: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-a92d-a3ea-00000000001f] 23826 1726867431.94341: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000001f 23826 1726867431.94429: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000001f 23826 1726867431.94432: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 23826 1726867431.94479: no more pending results, returning what we have 23826 1726867431.94482: results queue empty 23826 1726867431.94482: checking for any_errors_fatal 23826 1726867431.94490: done checking for any_errors_fatal 23826 1726867431.94491: checking for max_fail_percentage 23826 1726867431.94494: done checking for max_fail_percentage 23826 1726867431.94495: checking to see if all hosts have failed and the running result is not ok 23826 1726867431.94496: done checking to see if all hosts have failed 23826 1726867431.94496: getting the remaining hosts for this loop 23826 1726867431.94497: done getting the remaining hosts for this loop 23826 1726867431.94501: getting the next task for host managed_node2 23826 1726867431.94508: done getting next task for host managed_node2 23826 1726867431.94512: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 23826 1726867431.94515: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867431.94532: getting variables 23826 1726867431.94534: in VariableManager get_vars() 23826 1726867431.94569: Calling all_inventory to load vars for managed_node2 23826 1726867431.94572: Calling groups_inventory to load vars for managed_node2 23826 1726867431.94574: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867431.94590: Calling all_plugins_play to load vars for managed_node2 23826 1726867431.94593: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867431.94596: Calling groups_plugins_play to load vars for managed_node2 23826 1726867431.95365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867431.96230: done with get_vars() 23826 1726867431.96244: done getting variables 23826 1726867431.96285: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:23:51 -0400 (0:00:00.101) 0:00:13.974 ****** 23826 1726867431.96307: entering _queue_task() for managed_node2/package 23826 1726867431.96508: worker is 1 (out of 1 available) 23826 1726867431.96521: exiting _queue_task() for managed_node2/package 23826 1726867431.96532: done queuing things up, now waiting for results queue to drain 23826 1726867431.96534: waiting for pending results... 
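The "Install packages" task above is skipped because its guard is false: every package in network_packages already appears in the ansible_facts.packages cache (populated by an earlier package_facts run), so "not network_packages is subset(ansible_facts.packages.keys())" evaluates to False. A rough, hypothetical sketch of a task that would produce exactly this skip result; the real task lives at roles/network/tasks/main.yml:73 in the collection and may differ in detail:

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"   # package list the role derives, as seen in the log above
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - not network_packages is subset(ansible_facts.packages.keys())   # skip when everything is already installed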
23826 1726867431.96713: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 23826 1726867431.96792: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000020 23826 1726867431.96804: variable 'ansible_search_path' from source: unknown 23826 1726867431.96810: variable 'ansible_search_path' from source: unknown 23826 1726867431.96835: calling self._execute() 23826 1726867431.96906: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867431.96912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867431.96919: variable 'omit' from source: magic vars 23826 1726867431.97185: variable 'ansible_distribution_major_version' from source: facts 23826 1726867431.97199: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867431.97280: variable 'network_state' from source: role '' defaults 23826 1726867431.97288: Evaluated conditional (network_state != {}): False 23826 1726867431.97291: when evaluation is False, skipping this task 23826 1726867431.97294: _execute() done 23826 1726867431.97298: dumping result to json 23826 1726867431.97301: done dumping result, returning 23826 1726867431.97315: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-a92d-a3ea-000000000020] 23826 1726867431.97319: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000020 23826 1726867431.97396: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000020 23826 1726867431.97399: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 23826 1726867431.97457: no more pending results, returning what we have 23826 1726867431.97460: results queue empty 23826 1726867431.97461: checking for any_errors_fatal 23826 1726867431.97466: done checking for any_errors_fatal 23826 1726867431.97466: checking for max_fail_percentage 23826 1726867431.97468: done checking for max_fail_percentage 23826 1726867431.97468: checking to see if all hosts have failed and the running result is not ok 23826 1726867431.97469: done checking to see if all hosts have failed 23826 1726867431.97470: getting the remaining hosts for this loop 23826 1726867431.97471: done getting the remaining hosts for this loop 23826 1726867431.97474: getting the next task for host managed_node2 23826 1726867431.97481: done getting next task for host managed_node2 23826 1726867431.97485: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 23826 1726867431.97487: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867431.97499: getting variables 23826 1726867431.97500: in VariableManager get_vars() 23826 1726867431.97533: Calling all_inventory to load vars for managed_node2 23826 1726867431.97536: Calling groups_inventory to load vars for managed_node2 23826 1726867431.97538: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867431.97545: Calling all_plugins_play to load vars for managed_node2 23826 1726867431.97547: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867431.97550: Calling groups_plugins_play to load vars for managed_node2 23826 1726867431.98379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867431.99231: done with get_vars() 23826 1726867431.99244: done getting variables 23826 1726867431.99286: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:23:51 -0400 (0:00:00.029) 0:00:14.004 ****** 23826 1726867431.99309: entering _queue_task() for managed_node2/package 23826 1726867431.99483: worker is 1 (out of 1 available) 23826 1726867431.99495: exiting _queue_task() for managed_node2/package 23826 1726867431.99506: done queuing things up, now waiting for results queue to drain 23826 1726867431.99510: waiting for pending results... 
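The nmstate-related install tasks in this stretch of the log are gated on the network_state role variable, which stays at its default of {} when only network_connections is used; that is why the log reports "Evaluated conditional (network_state != {}): False" and skips them. A minimal illustrative sketch of that pattern (assumed shape and package names, not the collection's exact source at roles/network/tasks/main.yml:85):

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager   # assumed package names for illustration
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}   # only runs when the declarative network_state interface is in use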
23826 1726867431.99657: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 23826 1726867431.99739: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000021 23826 1726867431.99746: variable 'ansible_search_path' from source: unknown 23826 1726867431.99749: variable 'ansible_search_path' from source: unknown 23826 1726867431.99774: calling self._execute() 23826 1726867431.99835: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867431.99843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867431.99852: variable 'omit' from source: magic vars 23826 1726867432.00099: variable 'ansible_distribution_major_version' from source: facts 23826 1726867432.00110: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867432.00191: variable 'network_state' from source: role '' defaults 23826 1726867432.00199: Evaluated conditional (network_state != {}): False 23826 1726867432.00201: when evaluation is False, skipping this task 23826 1726867432.00204: _execute() done 23826 1726867432.00209: dumping result to json 23826 1726867432.00212: done dumping result, returning 23826 1726867432.00216: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-a92d-a3ea-000000000021] 23826 1726867432.00221: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000021 23826 1726867432.00311: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000021 23826 1726867432.00314: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 23826 1726867432.00358: no more pending results, returning what we have 23826 1726867432.00361: results queue empty 23826 1726867432.00362: checking for any_errors_fatal 23826 1726867432.00366: done checking for any_errors_fatal 23826 1726867432.00367: checking for max_fail_percentage 23826 1726867432.00368: done checking for max_fail_percentage 23826 1726867432.00369: checking to see if all hosts have failed and the running result is not ok 23826 1726867432.00370: done checking to see if all hosts have failed 23826 1726867432.00370: getting the remaining hosts for this loop 23826 1726867432.00371: done getting the remaining hosts for this loop 23826 1726867432.00374: getting the next task for host managed_node2 23826 1726867432.00381: done getting next task for host managed_node2 23826 1726867432.00385: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 23826 1726867432.00387: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867432.00400: getting variables 23826 1726867432.00401: in VariableManager get_vars() 23826 1726867432.00438: Calling all_inventory to load vars for managed_node2 23826 1726867432.00440: Calling groups_inventory to load vars for managed_node2 23826 1726867432.00441: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867432.00447: Calling all_plugins_play to load vars for managed_node2 23826 1726867432.00449: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867432.00450: Calling groups_plugins_play to load vars for managed_node2 23826 1726867432.01162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867432.02030: done with get_vars() 23826 1726867432.02043: done getting variables 23826 1726867432.02113: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:23:52 -0400 (0:00:00.028) 0:00:14.032 ****** 23826 1726867432.02133: entering _queue_task() for managed_node2/service 23826 1726867432.02134: Creating lock for service 23826 1726867432.02312: worker is 1 (out of 1 available) 23826 1726867432.02324: exiting _queue_task() for managed_node2/service 23826 1726867432.02334: done queuing things up, now waiting for results queue to drain 23826 1726867432.02336: waiting for pending results... 
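The service task queued here ("Restart NetworkManager due to wireless or team interfaces") is evaluated in the worker run that follows and skipped, because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for the connection profiles defined in network_connections for this run. A hedged sketch of the kind of task involved (the service name is an assumption; the role resolves it through its network_service_name variable):

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager   # assumption; the role may use "{{ network_service_name }}" here
    state: restarted
  when:
    - ansible_distribution_major_version != '6'
    - __network_wireless_connections_defined or __network_team_connections_defined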
23826 1726867432.02485: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 23826 1726867432.02553: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000022 23826 1726867432.02565: variable 'ansible_search_path' from source: unknown 23826 1726867432.02572: variable 'ansible_search_path' from source: unknown 23826 1726867432.02601: calling self._execute() 23826 1726867432.02672: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867432.02676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867432.02687: variable 'omit' from source: magic vars 23826 1726867432.02943: variable 'ansible_distribution_major_version' from source: facts 23826 1726867432.02951: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867432.03036: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867432.03163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867432.04808: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867432.04858: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867432.04891: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867432.04918: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867432.04940: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867432.04996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867432.05019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867432.05036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867432.05061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867432.05075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867432.05108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867432.05126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867432.05143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 23826 1726867432.05166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867432.05176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867432.05215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867432.05231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867432.05247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867432.05272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867432.05285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867432.05390: variable 'network_connections' from source: task vars 23826 1726867432.05403: variable 'interface' from source: set_fact 23826 1726867432.05450: variable 'interface' from source: set_fact 23826 1726867432.05458: variable 'interface' from source: set_fact 23826 1726867432.05500: variable 'interface' from source: set_fact 23826 1726867432.05550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867432.05658: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867432.05685: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867432.05707: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867432.05741: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867432.05770: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867432.05787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867432.05805: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867432.05825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867432.05871: variable '__network_team_connections_defined' from source: role '' defaults 
23826 1726867432.06018: variable 'network_connections' from source: task vars 23826 1726867432.06022: variable 'interface' from source: set_fact 23826 1726867432.06067: variable 'interface' from source: set_fact 23826 1726867432.06073: variable 'interface' from source: set_fact 23826 1726867432.06117: variable 'interface' from source: set_fact 23826 1726867432.06140: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 23826 1726867432.06143: when evaluation is False, skipping this task 23826 1726867432.06145: _execute() done 23826 1726867432.06148: dumping result to json 23826 1726867432.06150: done dumping result, returning 23826 1726867432.06162: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-a92d-a3ea-000000000022] 23826 1726867432.06171: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000022 23826 1726867432.06242: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000022 23826 1726867432.06244: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 23826 1726867432.06309: no more pending results, returning what we have 23826 1726867432.06313: results queue empty 23826 1726867432.06313: checking for any_errors_fatal 23826 1726867432.06321: done checking for any_errors_fatal 23826 1726867432.06321: checking for max_fail_percentage 23826 1726867432.06323: done checking for max_fail_percentage 23826 1726867432.06324: checking to see if all hosts have failed and the running result is not ok 23826 1726867432.06325: done checking to see if all hosts have failed 23826 1726867432.06325: getting the remaining hosts for this loop 23826 1726867432.06327: done getting the remaining hosts for this loop 23826 1726867432.06330: getting the next task for host managed_node2 23826 1726867432.06336: done getting next task for host managed_node2 23826 1726867432.06339: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 23826 1726867432.06342: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867432.06355: getting variables 23826 1726867432.06356: in VariableManager get_vars() 23826 1726867432.06390: Calling all_inventory to load vars for managed_node2 23826 1726867432.06392: Calling groups_inventory to load vars for managed_node2 23826 1726867432.06394: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867432.06402: Calling all_plugins_play to load vars for managed_node2 23826 1726867432.06404: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867432.06406: Calling groups_plugins_play to load vars for managed_node2 23826 1726867432.07252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867432.08108: done with get_vars() 23826 1726867432.08123: done getting variables 23826 1726867432.08162: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:23:52 -0400 (0:00:00.060) 0:00:14.093 ****** 23826 1726867432.08185: entering _queue_task() for managed_node2/service 23826 1726867432.08398: worker is 1 (out of 1 available) 23826 1726867432.08412: exiting _queue_task() for managed_node2/service 23826 1726867432.08424: done queuing things up, now waiting for results queue to drain 23826 1726867432.08426: waiting for pending results... 23826 1726867432.08599: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 23826 1726867432.08684: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000023 23826 1726867432.08696: variable 'ansible_search_path' from source: unknown 23826 1726867432.08699: variable 'ansible_search_path' from source: unknown 23826 1726867432.08729: calling self._execute() 23826 1726867432.08797: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867432.08801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867432.08809: variable 'omit' from source: magic vars 23826 1726867432.09073: variable 'ansible_distribution_major_version' from source: facts 23826 1726867432.09083: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867432.09192: variable 'network_provider' from source: set_fact 23826 1726867432.09198: variable 'network_state' from source: role '' defaults 23826 1726867432.09208: Evaluated conditional (network_provider == "nm" or network_state != {}): True 23826 1726867432.09213: variable 'omit' from source: magic vars 23826 1726867432.09252: variable 'omit' from source: magic vars 23826 1726867432.09273: variable 'network_service_name' from source: role '' defaults 23826 1726867432.09329: variable 'network_service_name' from source: role '' defaults 23826 1726867432.09403: variable '__network_provider_setup' from source: role '' defaults 23826 1726867432.09407: variable '__network_service_name_default_nm' from source: role '' defaults 23826 1726867432.09454: variable '__network_service_name_default_nm' from source: role '' defaults 23826 1726867432.09462: variable '__network_packages_default_nm' from source: role '' defaults 
23826 1726867432.09506: variable '__network_packages_default_nm' from source: role '' defaults 23826 1726867432.09656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867432.11070: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867432.11122: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867432.11154: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867432.11180: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867432.11199: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867432.11256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867432.11280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867432.11298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867432.11326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867432.11337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867432.11369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867432.11389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867432.11406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867432.11433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867432.11443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867432.11582: variable '__network_packages_default_gobject_packages' from source: role '' defaults 23826 1726867432.11659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867432.11675: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867432.11693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867432.11725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867432.11735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867432.11795: variable 'ansible_python' from source: facts 23826 1726867432.11815: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 23826 1726867432.11867: variable '__network_wpa_supplicant_required' from source: role '' defaults 23826 1726867432.11924: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 23826 1726867432.12003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867432.12025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867432.12043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867432.12068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867432.12080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867432.12112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867432.12132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867432.12153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867432.12178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867432.12189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867432.12280: variable 'network_connections' from 
source: task vars 23826 1726867432.12287: variable 'interface' from source: set_fact 23826 1726867432.12339: variable 'interface' from source: set_fact 23826 1726867432.12348: variable 'interface' from source: set_fact 23826 1726867432.12401: variable 'interface' from source: set_fact 23826 1726867432.12474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867432.12598: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867432.12634: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867432.12663: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867432.12697: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867432.12741: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867432.12761: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867432.12784: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867432.12812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867432.12846: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867432.13022: variable 'network_connections' from source: task vars 23826 1726867432.13028: variable 'interface' from source: set_fact 23826 1726867432.13079: variable 'interface' from source: set_fact 23826 1726867432.13088: variable 'interface' from source: set_fact 23826 1726867432.13142: variable 'interface' from source: set_fact 23826 1726867432.13174: variable '__network_packages_default_wireless' from source: role '' defaults 23826 1726867432.13235: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867432.13414: variable 'network_connections' from source: task vars 23826 1726867432.13418: variable 'interface' from source: set_fact 23826 1726867432.13468: variable 'interface' from source: set_fact 23826 1726867432.13474: variable 'interface' from source: set_fact 23826 1726867432.13525: variable 'interface' from source: set_fact 23826 1726867432.13542: variable '__network_packages_default_team' from source: role '' defaults 23826 1726867432.13598: variable '__network_team_connections_defined' from source: role '' defaults 23826 1726867432.13778: variable 'network_connections' from source: task vars 23826 1726867432.13786: variable 'interface' from source: set_fact 23826 1726867432.13835: variable 'interface' from source: set_fact 23826 1726867432.13840: variable 'interface' from source: set_fact 23826 1726867432.13893: variable 'interface' from source: set_fact 23826 1726867432.13934: variable '__network_service_name_default_initscripts' from source: role '' defaults 23826 1726867432.13975: variable '__network_service_name_default_initscripts' from source: role '' defaults 23826 
1726867432.13982: variable '__network_packages_default_initscripts' from source: role '' defaults 23826 1726867432.14026: variable '__network_packages_default_initscripts' from source: role '' defaults 23826 1726867432.14164: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 23826 1726867432.14475: variable 'network_connections' from source: task vars 23826 1726867432.14479: variable 'interface' from source: set_fact 23826 1726867432.14522: variable 'interface' from source: set_fact 23826 1726867432.14528: variable 'interface' from source: set_fact 23826 1726867432.14572: variable 'interface' from source: set_fact 23826 1726867432.14580: variable 'ansible_distribution' from source: facts 23826 1726867432.14583: variable '__network_rh_distros' from source: role '' defaults 23826 1726867432.14589: variable 'ansible_distribution_major_version' from source: facts 23826 1726867432.14605: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 23826 1726867432.14717: variable 'ansible_distribution' from source: facts 23826 1726867432.14721: variable '__network_rh_distros' from source: role '' defaults 23826 1726867432.14725: variable 'ansible_distribution_major_version' from source: facts 23826 1726867432.14736: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 23826 1726867432.14846: variable 'ansible_distribution' from source: facts 23826 1726867432.14850: variable '__network_rh_distros' from source: role '' defaults 23826 1726867432.14855: variable 'ansible_distribution_major_version' from source: facts 23826 1726867432.14884: variable 'network_provider' from source: set_fact 23826 1726867432.14899: variable 'omit' from source: magic vars 23826 1726867432.14918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867432.14938: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867432.14952: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867432.14964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867432.14974: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867432.15000: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867432.15003: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867432.15006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867432.15068: Set connection var ansible_timeout to 10 23826 1726867432.15075: Set connection var ansible_shell_executable to /bin/sh 23826 1726867432.15083: Set connection var ansible_connection to ssh 23826 1726867432.15092: Set connection var ansible_pipelining to False 23826 1726867432.15095: Set connection var ansible_shell_type to sh 23826 1726867432.15097: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867432.15116: variable 'ansible_shell_executable' from source: unknown 23826 1726867432.15119: variable 'ansible_connection' from source: unknown 23826 1726867432.15122: variable 'ansible_module_compression' from source: unknown 23826 1726867432.15124: variable 'ansible_shell_type' from source: unknown 23826 1726867432.15126: variable 
'ansible_shell_executable' from source: unknown 23826 1726867432.15128: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867432.15133: variable 'ansible_pipelining' from source: unknown 23826 1726867432.15135: variable 'ansible_timeout' from source: unknown 23826 1726867432.15138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867432.15208: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867432.15219: variable 'omit' from source: magic vars 23826 1726867432.15227: starting attempt loop 23826 1726867432.15238: running the handler 23826 1726867432.15304: variable 'ansible_facts' from source: unknown 23826 1726867432.15831: _low_level_execute_command(): starting 23826 1726867432.15835: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867432.16332: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867432.16335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 23826 1726867432.16339: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 23826 1726867432.16342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867432.16383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867432.16404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867432.16410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867432.16452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867432.18153: stdout chunk (state=3): >>>/root <<< 23826 1726867432.18254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867432.18276: stderr chunk (state=3): >>><<< 23826 1726867432.18284: stdout chunk (state=3): >>><<< 23826 1726867432.18300: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867432.18313: _low_level_execute_command(): starting 23826 1726867432.18316: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867432.1830034-24606-33296156254682 `" && echo ansible-tmp-1726867432.1830034-24606-33296156254682="` echo /root/.ansible/tmp/ansible-tmp-1726867432.1830034-24606-33296156254682 `" ) && sleep 0' 23826 1726867432.18725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867432.18728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867432.18731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867432.18733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867432.18784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867432.18791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867432.18831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867432.20748: stdout chunk (state=3): >>>ansible-tmp-1726867432.1830034-24606-33296156254682=/root/.ansible/tmp/ansible-tmp-1726867432.1830034-24606-33296156254682 <<< 23826 1726867432.20885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867432.20903: stdout chunk (state=3): >>><<< 23826 1726867432.20906: stderr chunk (state=3): >>><<< 23826 1726867432.20919: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867432.1830034-24606-33296156254682=/root/.ansible/tmp/ansible-tmp-1726867432.1830034-24606-33296156254682 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867432.21082: variable 'ansible_module_compression' from source: unknown 23826 1726867432.21087: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 23826 1726867432.21090: ANSIBALLZ: Acquiring lock 23826 1726867432.21092: ANSIBALLZ: Lock acquired: 139851310993328 23826 1726867432.21094: ANSIBALLZ: Creating module 23826 1726867432.52736: ANSIBALLZ: Writing module into payload 23826 1726867432.52846: ANSIBALLZ: Writing module 23826 1726867432.52874: ANSIBALLZ: Renaming module 23826 1726867432.52886: ANSIBALLZ: Done creating module 23826 1726867432.52901: variable 'ansible_facts' from source: unknown 23826 1726867432.53124: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867432.1830034-24606-33296156254682/AnsiballZ_systemd.py 23826 1726867432.53273: Sending initial data 23826 1726867432.53276: Sent initial data (155 bytes) 23826 1726867432.53895: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867432.53898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867432.53900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867432.53903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867432.53906: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867432.53943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867432.53947: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867432.53958: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867432.54038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867432.55682: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867432.55731: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 23826 1726867432.55770: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmp1lfp7wqf /root/.ansible/tmp/ansible-tmp-1726867432.1830034-24606-33296156254682/AnsiballZ_systemd.py <<< 23826 1726867432.55773: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867432.1830034-24606-33296156254682/AnsiballZ_systemd.py" <<< 23826 1726867432.55812: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmp1lfp7wqf" to remote "/root/.ansible/tmp/ansible-tmp-1726867432.1830034-24606-33296156254682/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867432.1830034-24606-33296156254682/AnsiballZ_systemd.py" <<< 23826 1726867432.57341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867432.57345: stdout chunk (state=3): >>><<< 23826 1726867432.57347: stderr chunk (state=3): >>><<< 23826 1726867432.57349: done transferring module to remote 23826 1726867432.57351: _low_level_execute_command(): starting 23826 1726867432.57353: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867432.1830034-24606-33296156254682/ /root/.ansible/tmp/ansible-tmp-1726867432.1830034-24606-33296156254682/AnsiballZ_systemd.py && sleep 0' 23826 1726867432.57921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867432.57938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867432.57950: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867432.57972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867432.58044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867432.59907: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 23826 1726867432.59990: stdout chunk (state=3): >>><<< 23826 1726867432.60004: stderr chunk (state=3): >>><<< 23826 1726867432.60100: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867432.60103: _low_level_execute_command(): starting 23826 1726867432.60106: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867432.1830034-24606-33296156254682/AnsiballZ_systemd.py && sleep 0' 23826 1726867432.60753: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867432.60774: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867432.60793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867432.60811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867432.60896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867432.60931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867432.60946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867432.60966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867432.61048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867432.90568: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": 
"infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4542464", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3318030336", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1073680000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", 
"DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 23826 1726867432.90619: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ConditionTimestampMonotonic": "284275676", "AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 23826 1726867432.92688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 23826 1726867432.92692: stdout chunk (state=3): >>><<< 23826 1726867432.92694: stderr chunk (state=3): >>><<< 23826 1726867432.92698: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4542464", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3318030336", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1073680000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ConditionTimestampMonotonic": "284275676", "AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 23826 1726867432.92829: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867432.1830034-24606-33296156254682/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867432.92841: _low_level_execute_command(): starting 23826 1726867432.92850: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867432.1830034-24606-33296156254682/ > /dev/null 2>&1 && sleep 0' 23826 1726867432.93487: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867432.93499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867432.93568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867432.93627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867432.93648: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867432.93717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867432.95604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867432.95620: stdout chunk (state=3): >>><<< 23826 1726867432.95637: stderr chunk (state=3): >>><<< 23826 1726867432.95655: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867432.95667: handler run complete 23826 1726867432.95738: attempt loop complete, returning result 23826 1726867432.95749: _execute() done 23826 1726867432.95756: dumping result to json 23826 1726867432.95782: done dumping result, returning 23826 1726867432.95844: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-a92d-a3ea-000000000023] 23826 1726867432.95849: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000023 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 23826 1726867432.96230: no more pending results, returning what we have 23826 1726867432.96234: results queue empty 23826 1726867432.96235: checking for any_errors_fatal 23826 1726867432.96241: done checking for any_errors_fatal 23826 1726867432.96242: checking for max_fail_percentage 23826 1726867432.96244: done checking for max_fail_percentage 23826 1726867432.96245: checking to see if all hosts have failed and the running result is not ok 23826 1726867432.96246: done checking to see if all hosts have failed 23826 1726867432.96247: getting the remaining hosts for this loop 23826 1726867432.96248: done getting the remaining hosts for this loop 23826 1726867432.96252: getting the next task for host managed_node2 23826 1726867432.96259: done getting next task for host managed_node2 23826 1726867432.96262: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 23826 1726867432.96265: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867432.96481: getting variables 23826 1726867432.96483: in VariableManager get_vars() 23826 1726867432.96520: Calling all_inventory to load vars for managed_node2 23826 1726867432.96525: Calling groups_inventory to load vars for managed_node2 23826 1726867432.96527: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867432.96537: Calling all_plugins_play to load vars for managed_node2 23826 1726867432.96540: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867432.96542: Calling groups_plugins_play to load vars for managed_node2 23826 1726867432.97125: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000023 23826 1726867432.97130: WORKER PROCESS EXITING 23826 1726867432.98213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867432.99803: done with get_vars() 23826 1726867432.99824: done getting variables 23826 1726867432.99890: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:23:52 -0400 (0:00:00.917) 0:00:15.010 ****** 23826 1726867432.99922: entering _queue_task() for managed_node2/service 23826 1726867433.00325: worker is 1 (out of 1 available) 23826 1726867433.00335: exiting _queue_task() for managed_node2/service 23826 1726867433.00346: done queuing things up, now waiting for results queue to drain 23826 1726867433.00347: waiting for pending results... 
23826 1726867433.00536: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 23826 1726867433.00669: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000024 23826 1726867433.00696: variable 'ansible_search_path' from source: unknown 23826 1726867433.00705: variable 'ansible_search_path' from source: unknown 23826 1726867433.00766: calling self._execute() 23826 1726867433.00870: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867433.00885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867433.00904: variable 'omit' from source: magic vars 23826 1726867433.01312: variable 'ansible_distribution_major_version' from source: facts 23826 1726867433.01343: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867433.01499: variable 'network_provider' from source: set_fact 23826 1726867433.01608: Evaluated conditional (network_provider == "nm"): True 23826 1726867433.01612: variable '__network_wpa_supplicant_required' from source: role '' defaults 23826 1726867433.01699: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 23826 1726867433.01907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867433.05985: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867433.06057: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867433.06153: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867433.06251: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867433.06359: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867433.06589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867433.06627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867433.06774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867433.06830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867433.06865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867433.06995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867433.07021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 23826 1726867433.07052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867433.07106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867433.07201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867433.07207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867433.07213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867433.07239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867433.07284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867433.07309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867433.07460: variable 'network_connections' from source: task vars 23826 1726867433.07481: variable 'interface' from source: set_fact 23826 1726867433.07567: variable 'interface' from source: set_fact 23826 1726867433.07585: variable 'interface' from source: set_fact 23826 1726867433.07659: variable 'interface' from source: set_fact 23826 1726867433.07732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867433.07911: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867433.07963: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867433.08072: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867433.08078: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867433.08084: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867433.08110: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867433.08140: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867433.08171: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867433.08248: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867433.08548: variable 'network_connections' from source: task vars 23826 1726867433.08558: variable 'interface' from source: set_fact 23826 1726867433.08632: variable 'interface' from source: set_fact 23826 1726867433.08643: variable 'interface' from source: set_fact 23826 1726867433.08704: variable 'interface' from source: set_fact 23826 1726867433.08858: Evaluated conditional (__network_wpa_supplicant_required): False 23826 1726867433.08866: when evaluation is False, skipping this task 23826 1726867433.08872: _execute() done 23826 1726867433.09036: dumping result to json 23826 1726867433.09039: done dumping result, returning 23826 1726867433.09042: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-a92d-a3ea-000000000024] 23826 1726867433.09044: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000024 23826 1726867433.09384: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000024 23826 1726867433.09387: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 23826 1726867433.09429: no more pending results, returning what we have 23826 1726867433.09433: results queue empty 23826 1726867433.09433: checking for any_errors_fatal 23826 1726867433.09451: done checking for any_errors_fatal 23826 1726867433.09452: checking for max_fail_percentage 23826 1726867433.09453: done checking for max_fail_percentage 23826 1726867433.09454: checking to see if all hosts have failed and the running result is not ok 23826 1726867433.09455: done checking to see if all hosts have failed 23826 1726867433.09456: getting the remaining hosts for this loop 23826 1726867433.09457: done getting the remaining hosts for this loop 23826 1726867433.09460: getting the next task for host managed_node2 23826 1726867433.09466: done getting next task for host managed_node2 23826 1726867433.09470: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 23826 1726867433.09473: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867433.09489: getting variables 23826 1726867433.09491: in VariableManager get_vars() 23826 1726867433.09530: Calling all_inventory to load vars for managed_node2 23826 1726867433.09533: Calling groups_inventory to load vars for managed_node2 23826 1726867433.09536: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867433.09545: Calling all_plugins_play to load vars for managed_node2 23826 1726867433.09547: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867433.09550: Calling groups_plugins_play to load vars for managed_node2 23826 1726867433.12124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867433.15402: done with get_vars() 23826 1726867433.15427: done getting variables 23826 1726867433.15603: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:23:53 -0400 (0:00:00.157) 0:00:15.167 ****** 23826 1726867433.15636: entering _queue_task() for managed_node2/service 23826 1726867433.16290: worker is 1 (out of 1 available) 23826 1726867433.16415: exiting _queue_task() for managed_node2/service 23826 1726867433.16429: done queuing things up, now waiting for results queue to drain 23826 1726867433.16430: waiting for pending results... 23826 1726867433.16995: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 23826 1726867433.17383: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000025 23826 1726867433.17387: variable 'ansible_search_path' from source: unknown 23826 1726867433.17391: variable 'ansible_search_path' from source: unknown 23826 1726867433.17393: calling self._execute() 23826 1726867433.17399: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867433.17402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867433.17405: variable 'omit' from source: magic vars 23826 1726867433.18123: variable 'ansible_distribution_major_version' from source: facts 23826 1726867433.18296: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867433.18416: variable 'network_provider' from source: set_fact 23826 1726867433.18493: Evaluated conditional (network_provider == "initscripts"): False 23826 1726867433.18502: when evaluation is False, skipping this task 23826 1726867433.18513: _execute() done 23826 1726867433.18520: dumping result to json 23826 1726867433.18528: done dumping result, returning 23826 1726867433.18540: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-a92d-a3ea-000000000025] 23826 1726867433.18551: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000025 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 23826 1726867433.18706: no more pending results, returning what we have 23826 1726867433.18710: results queue empty 23826 1726867433.18711: checking for 
any_errors_fatal 23826 1726867433.18831: done checking for any_errors_fatal 23826 1726867433.18833: checking for max_fail_percentage 23826 1726867433.18835: done checking for max_fail_percentage 23826 1726867433.18836: checking to see if all hosts have failed and the running result is not ok 23826 1726867433.18837: done checking to see if all hosts have failed 23826 1726867433.18837: getting the remaining hosts for this loop 23826 1726867433.18839: done getting the remaining hosts for this loop 23826 1726867433.18843: getting the next task for host managed_node2 23826 1726867433.18849: done getting next task for host managed_node2 23826 1726867433.18853: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 23826 1726867433.18857: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867433.18874: getting variables 23826 1726867433.18875: in VariableManager get_vars() 23826 1726867433.18919: Calling all_inventory to load vars for managed_node2 23826 1726867433.18922: Calling groups_inventory to load vars for managed_node2 23826 1726867433.18924: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867433.18938: Calling all_plugins_play to load vars for managed_node2 23826 1726867433.18941: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867433.18944: Calling groups_plugins_play to load vars for managed_node2 23826 1726867433.19586: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000025 23826 1726867433.19590: WORKER PROCESS EXITING 23826 1726867433.21938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867433.25186: done with get_vars() 23826 1726867433.25208: done getting variables 23826 1726867433.25382: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:23:53 -0400 (0:00:00.097) 0:00:15.265 ****** 23826 1726867433.25417: entering _queue_task() for managed_node2/copy 23826 1726867433.25990: worker is 1 (out of 1 available) 23826 1726867433.26003: exiting _queue_task() for managed_node2/copy 23826 1726867433.26014: done queuing things up, now waiting for results queue to drain 23826 1726867433.26016: waiting for pending results... 
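For reference, the two skips above follow the same gating pattern: each provider-specific task in the role is conditioned both on ansible_distribution_major_version != '6' and on the active network_provider, and with network_provider set to "nm" the initscripts-only tasks evaluate their condition to False. A minimal sketch of that pattern in Ansible YAML follows; the service name and module options are illustrative assumptions, not the role's actual task from tasks/main.yml:

    # Illustrative sketch of the conditional gating seen in the log above.
    # The service name "network" and the exact options are assumptions.
    - name: Enable network service
      ansible.builtin.service:
        name: network
        state: started
        enabled: true
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "initscripts"

With network_provider resolved to "nm", the second condition is False, which matches the "Conditional result was False" skip reasons reported above.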
23826 1726867433.26549: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 23826 1726867433.26673: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000026 23826 1726867433.27083: variable 'ansible_search_path' from source: unknown 23826 1726867433.27087: variable 'ansible_search_path' from source: unknown 23826 1726867433.27089: calling self._execute() 23826 1726867433.27092: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867433.27094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867433.27096: variable 'omit' from source: magic vars 23826 1726867433.27818: variable 'ansible_distribution_major_version' from source: facts 23826 1726867433.28087: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867433.28400: variable 'network_provider' from source: set_fact 23826 1726867433.28416: Evaluated conditional (network_provider == "initscripts"): False 23826 1726867433.28425: when evaluation is False, skipping this task 23826 1726867433.28434: _execute() done 23826 1726867433.28441: dumping result to json 23826 1726867433.28449: done dumping result, returning 23826 1726867433.28463: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-a92d-a3ea-000000000026] 23826 1726867433.28473: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000026 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 23826 1726867433.28635: no more pending results, returning what we have 23826 1726867433.28638: results queue empty 23826 1726867433.28639: checking for any_errors_fatal 23826 1726867433.28648: done checking for any_errors_fatal 23826 1726867433.28649: checking for max_fail_percentage 23826 1726867433.28651: done checking for max_fail_percentage 23826 1726867433.28652: checking to see if all hosts have failed and the running result is not ok 23826 1726867433.28653: done checking to see if all hosts have failed 23826 1726867433.28653: getting the remaining hosts for this loop 23826 1726867433.28655: done getting the remaining hosts for this loop 23826 1726867433.28659: getting the next task for host managed_node2 23826 1726867433.28666: done getting next task for host managed_node2 23826 1726867433.28669: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 23826 1726867433.28672: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867433.28686: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000026 23826 1726867433.28689: WORKER PROCESS EXITING 23826 1726867433.28699: getting variables 23826 1726867433.28700: in VariableManager get_vars() 23826 1726867433.28742: Calling all_inventory to load vars for managed_node2 23826 1726867433.28745: Calling groups_inventory to load vars for managed_node2 23826 1726867433.28748: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867433.28758: Calling all_plugins_play to load vars for managed_node2 23826 1726867433.28761: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867433.28764: Calling groups_plugins_play to load vars for managed_node2 23826 1726867433.32292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867433.36389: done with get_vars() 23826 1726867433.36418: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:23:53 -0400 (0:00:00.115) 0:00:15.380 ****** 23826 1726867433.36923: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 23826 1726867433.36924: Creating lock for fedora.linux_system_roles.network_connections 23826 1726867433.37962: worker is 1 (out of 1 available) 23826 1726867433.37984: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 23826 1726867433.38002: done queuing things up, now waiting for results queue to drain 23826 1726867433.38004: waiting for pending results... 23826 1726867433.38372: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 23826 1726867433.38883: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000027 23826 1726867433.38887: variable 'ansible_search_path' from source: unknown 23826 1726867433.38889: variable 'ansible_search_path' from source: unknown 23826 1726867433.38891: calling self._execute() 23826 1726867433.38894: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867433.38896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867433.38898: variable 'omit' from source: magic vars 23826 1726867433.39614: variable 'ansible_distribution_major_version' from source: facts 23826 1726867433.39816: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867433.39824: variable 'omit' from source: magic vars 23826 1726867433.40282: variable 'omit' from source: magic vars 23826 1726867433.40285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867433.45735: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867433.46049: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867433.46094: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867433.46137: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867433.46168: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867433.46251: 
variable 'network_provider' from source: set_fact 23826 1726867433.46613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867433.46645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867433.47083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867433.47086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867433.47089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867433.47092: variable 'omit' from source: magic vars 23826 1726867433.47163: variable 'omit' from source: magic vars 23826 1726867433.47486: variable 'network_connections' from source: task vars 23826 1726867433.47504: variable 'interface' from source: set_fact 23826 1726867433.47574: variable 'interface' from source: set_fact 23826 1726867433.47883: variable 'interface' from source: set_fact 23826 1726867433.47887: variable 'interface' from source: set_fact 23826 1726867433.48031: variable 'omit' from source: magic vars 23826 1726867433.48382: variable '__lsr_ansible_managed' from source: task vars 23826 1726867433.48385: variable '__lsr_ansible_managed' from source: task vars 23826 1726867433.48814: Loaded config def from plugin (lookup/template) 23826 1726867433.48824: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 23826 1726867433.48854: File lookup term: get_ansible_managed.j2 23826 1726867433.48862: variable 'ansible_search_path' from source: unknown 23826 1726867433.48873: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 23826 1726867433.48894: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 23826 1726867433.48919: variable 'ansible_search_path' from source: 
unknown 23826 1726867433.61510: variable 'ansible_managed' from source: unknown 23826 1726867433.61643: variable 'omit' from source: magic vars 23826 1726867433.61916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867433.61951: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867433.61979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867433.62002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867433.62022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867433.62054: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867433.62062: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867433.62069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867433.62165: Set connection var ansible_timeout to 10 23826 1726867433.62582: Set connection var ansible_shell_executable to /bin/sh 23826 1726867433.62585: Set connection var ansible_connection to ssh 23826 1726867433.62587: Set connection var ansible_pipelining to False 23826 1726867433.62590: Set connection var ansible_shell_type to sh 23826 1726867433.62591: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867433.62593: variable 'ansible_shell_executable' from source: unknown 23826 1726867433.62595: variable 'ansible_connection' from source: unknown 23826 1726867433.62597: variable 'ansible_module_compression' from source: unknown 23826 1726867433.62599: variable 'ansible_shell_type' from source: unknown 23826 1726867433.62602: variable 'ansible_shell_executable' from source: unknown 23826 1726867433.62603: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867433.62605: variable 'ansible_pipelining' from source: unknown 23826 1726867433.62610: variable 'ansible_timeout' from source: unknown 23826 1726867433.62612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867433.62796: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 23826 1726867433.62823: variable 'omit' from source: magic vars 23826 1726867433.62836: starting attempt loop 23826 1726867433.62843: running the handler 23826 1726867433.62875: _low_level_execute_command(): starting 23826 1726867433.62890: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867433.64182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867433.64294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867433.64316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867433.64493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867433.64527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867433.64544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867433.64644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867433.64854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867433.66554: stdout chunk (state=3): >>>/root <<< 23826 1726867433.66813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867433.66816: stdout chunk (state=3): >>><<< 23826 1726867433.66819: stderr chunk (state=3): >>><<< 23826 1726867433.66822: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867433.66824: _low_level_execute_command(): starting 23826 1726867433.66827: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867433.667284-24673-203694450251128 `" && echo ansible-tmp-1726867433.667284-24673-203694450251128="` echo /root/.ansible/tmp/ansible-tmp-1726867433.667284-24673-203694450251128 `" ) && sleep 0' 23826 1726867433.67921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867433.68138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867433.68180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867433.70212: stdout chunk (state=3): >>>ansible-tmp-1726867433.667284-24673-203694450251128=/root/.ansible/tmp/ansible-tmp-1726867433.667284-24673-203694450251128 <<< 23826 1726867433.70300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867433.70310: stdout chunk (state=3): >>><<< 23826 1726867433.70321: stderr chunk (state=3): >>><<< 23826 1726867433.70683: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867433.667284-24673-203694450251128=/root/.ansible/tmp/ansible-tmp-1726867433.667284-24673-203694450251128 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867433.70686: variable 'ansible_module_compression' from source: unknown 23826 1726867433.70689: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 23826 1726867433.70691: ANSIBALLZ: Acquiring lock 23826 1726867433.70693: ANSIBALLZ: Lock acquired: 139851305333920 23826 1726867433.70695: ANSIBALLZ: Creating module 23826 1726867434.11659: ANSIBALLZ: Writing module into payload 23826 1726867434.12391: ANSIBALLZ: Writing module 23826 1726867434.12419: ANSIBALLZ: Renaming module 23826 1726867434.12431: ANSIBALLZ: Done creating module 23826 1726867434.12683: variable 'ansible_facts' from source: unknown 23826 1726867434.12798: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867433.667284-24673-203694450251128/AnsiballZ_network_connections.py 23826 1726867434.13140: Sending initial data 23826 1726867434.13143: Sent initial data (167 bytes) 23826 1726867434.14293: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867434.14468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867434.14545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867434.14619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867434.16299: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867434.16415: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867434.16446: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpy02s6x_g /root/.ansible/tmp/ansible-tmp-1726867433.667284-24673-203694450251128/AnsiballZ_network_connections.py <<< 23826 1726867434.16457: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867433.667284-24673-203694450251128/AnsiballZ_network_connections.py" <<< 23826 1726867434.16517: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpy02s6x_g" to remote "/root/.ansible/tmp/ansible-tmp-1726867433.667284-24673-203694450251128/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867433.667284-24673-203694450251128/AnsiballZ_network_connections.py" <<< 23826 1726867434.18760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867434.18790: stderr chunk (state=3): >>><<< 23826 1726867434.18854: stdout chunk (state=3): >>><<< 23826 1726867434.18883: done transferring module to remote 23826 1726867434.19019: _low_level_execute_command(): starting 23826 1726867434.19022: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867433.667284-24673-203694450251128/ /root/.ansible/tmp/ansible-tmp-1726867433.667284-24673-203694450251128/AnsiballZ_network_connections.py && sleep 0' 23826 1726867434.20217: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867434.20235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867434.20292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867434.20451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867434.20506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867434.20538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867434.22611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867434.22615: stderr chunk (state=3): >>><<< 23826 1726867434.22617: stdout chunk (state=3): >>><<< 23826 1726867434.22620: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867434.22622: _low_level_execute_command(): starting 23826 1726867434.22625: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867433.667284-24673-203694450251128/AnsiballZ_network_connections.py && sleep 0' 23826 1726867434.23812: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867434.23828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867434.23842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867434.23873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867434.24207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867434.24223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867434.50488: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 41871d48-5e02-497c-b448-55b0b16ff70d\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 23826 1726867434.52479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867434.52484: stderr 
chunk (state=3): >>>Shared connection to 10.31.12.116 closed. <<< 23826 1726867434.52487: stderr chunk (state=3): >>><<< 23826 1726867434.52603: stdout chunk (state=3): >>><<< 23826 1726867434.52607: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 41871d48-5e02-497c-b448-55b0b16ff70d\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
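The module_args echoed in the result above map directly onto the role's public input. A minimal play sketch that would produce the same fedora.linux_system_roles.network_connections invocation is shown below; note that in the log both network_provider and the interface name come from set_fact, so hard-coding them as play vars here is a simplification for illustration:

    # Sketch of play-level input consistent with the module_args shown above.
    # The real test play derives "ethtest0" and network_provider via set_fact.
    - hosts: managed_node2
      vars:
        network_provider: nm
        network_connections:
          - name: ethtest0
            interface_name: ethtest0
            type: ethernet
            ip:
              ipv6_disabled: true
      roles:
        - fedora.linux_system_roles.network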
23826 1726867434.52610: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'type': 'ethernet', 'ip': {'ipv6_disabled': True}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867433.667284-24673-203694450251128/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867434.52612: _low_level_execute_command(): starting 23826 1726867434.52614: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867433.667284-24673-203694450251128/ > /dev/null 2>&1 && sleep 0' 23826 1726867434.53793: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867434.53809: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867434.53892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867434.54050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867434.54130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867434.56293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867434.56297: stdout chunk (state=3): >>><<< 23826 1726867434.56299: stderr chunk (state=3): >>><<< 23826 1726867434.56302: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867434.56304: handler run complete 23826 1726867434.56339: attempt loop complete, returning result 23826 1726867434.56584: _execute() done 23826 1726867434.56587: dumping result to json 23826 1726867434.56589: done dumping result, returning 23826 1726867434.56592: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-a92d-a3ea-000000000027] 23826 1726867434.56594: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000027 changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "ethtest0", "ip": { "ipv6_disabled": true }, "name": "ethtest0", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 41871d48-5e02-497c-b448-55b0b16ff70d 23826 1726867434.56885: no more pending results, returning what we have 23826 1726867434.56889: results queue empty 23826 1726867434.56890: checking for any_errors_fatal 23826 1726867434.56898: done checking for any_errors_fatal 23826 1726867434.56899: checking for max_fail_percentage 23826 1726867434.56905: done checking for max_fail_percentage 23826 1726867434.56906: checking to see if all hosts have failed and the running result is not ok 23826 1726867434.56907: done checking to see if all hosts have failed 23826 1726867434.56907: getting the remaining hosts for this loop 23826 1726867434.56909: done getting the remaining hosts for this loop 23826 1726867434.56912: getting the next task for host managed_node2 23826 1726867434.56919: done getting next task for host managed_node2 23826 1726867434.56923: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 23826 1726867434.56926: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867434.56939: getting variables 23826 1726867434.56940: in VariableManager get_vars() 23826 1726867434.57493: Calling all_inventory to load vars for managed_node2 23826 1726867434.57496: Calling groups_inventory to load vars for managed_node2 23826 1726867434.57500: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867434.57510: Calling all_plugins_play to load vars for managed_node2 23826 1726867434.57513: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867434.57516: Calling groups_plugins_play to load vars for managed_node2 23826 1726867434.58148: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000027 23826 1726867434.58152: WORKER PROCESS EXITING 23826 1726867434.61356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867434.64933: done with get_vars() 23826 1726867434.64961: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:23:54 -0400 (0:00:01.284) 0:00:16.665 ****** 23826 1726867434.65372: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 23826 1726867434.65374: Creating lock for fedora.linux_system_roles.network_state 23826 1726867434.66230: worker is 1 (out of 1 available) 23826 1726867434.66243: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 23826 1726867434.66257: done queuing things up, now waiting for results queue to drain 23826 1726867434.66259: waiting for pending results... 23826 1726867434.66646: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 23826 1726867434.66926: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000028 23826 1726867434.67015: variable 'ansible_search_path' from source: unknown 23826 1726867434.67024: variable 'ansible_search_path' from source: unknown 23826 1726867434.67213: calling self._execute() 23826 1726867434.67365: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867434.67371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867434.67383: variable 'omit' from source: magic vars 23826 1726867434.68753: variable 'ansible_distribution_major_version' from source: facts 23826 1726867434.68764: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867434.69029: variable 'network_state' from source: role '' defaults 23826 1726867434.69036: Evaluated conditional (network_state != {}): False 23826 1726867434.69040: when evaluation is False, skipping this task 23826 1726867434.69043: _execute() done 23826 1726867434.69045: dumping result to json 23826 1726867434.69049: done dumping result, returning 23826 1726867434.69268: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-a92d-a3ea-000000000028] 23826 1726867434.69316: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000028 23826 1726867434.69547: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000028 23826 1726867434.69550: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 23826 1726867434.69735: no more pending results, returning what we have 
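The "Configure networking state" skip above reflects that network_state still holds its role default of {} in this run; only network_connections was supplied. To exercise that branch instead, a caller would pass a non-empty network_state, which the role is expected to hand to its network_state module in Nmstate-style form. The fragment below is a hypothetical example only, reusing the interface name from this run; it is not taken from this log:

    # Hypothetical input that would make "network_state != {}" evaluate True.
    # Assumes Nmstate-style desired state; the structure is illustrative.
    network_state:
      interfaces:
        - name: ethtest0
          type: ethernet
          state: up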
23826 1726867434.69744: results queue empty 23826 1726867434.69745: checking for any_errors_fatal 23826 1726867434.69761: done checking for any_errors_fatal 23826 1726867434.69762: checking for max_fail_percentage 23826 1726867434.69764: done checking for max_fail_percentage 23826 1726867434.69765: checking to see if all hosts have failed and the running result is not ok 23826 1726867434.69766: done checking to see if all hosts have failed 23826 1726867434.69767: getting the remaining hosts for this loop 23826 1726867434.69769: done getting the remaining hosts for this loop 23826 1726867434.69775: getting the next task for host managed_node2 23826 1726867434.69788: done getting next task for host managed_node2 23826 1726867434.69792: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 23826 1726867434.69798: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867434.69822: getting variables 23826 1726867434.69825: in VariableManager get_vars() 23826 1726867434.70331: Calling all_inventory to load vars for managed_node2 23826 1726867434.70337: Calling groups_inventory to load vars for managed_node2 23826 1726867434.70343: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867434.70354: Calling all_plugins_play to load vars for managed_node2 23826 1726867434.70357: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867434.70362: Calling groups_plugins_play to load vars for managed_node2 23826 1726867434.74045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867434.77780: done with get_vars() 23826 1726867434.77809: done getting variables 23826 1726867434.77868: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:23:54 -0400 (0:00:00.125) 0:00:16.790 ****** 23826 1726867434.77911: entering _queue_task() for managed_node2/debug 23826 1726867434.78429: worker is 1 (out of 1 available) 23826 1726867434.78442: exiting _queue_task() for managed_node2/debug 23826 1726867434.78470: done queuing things up, now waiting for results queue to drain 23826 1726867434.78472: waiting for pending results... 
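The debug task being queued here reads __network_connections_result, which later log lines report as coming "from source: set_fact". The capture step implied by this run can therefore only be sketched approximately; whether the role registers the module result directly or copies it into __network_connections_result with set_fact is not reproduced exactly below:

    # Sketch of the capture step implied by the log; "register" here stands in
    # for however the role actually stores the result (the log shows the
    # variable originating from set_fact).
    - name: Configure networking connection profiles
      fedora.linux_system_roles.network_connections:
        provider: "{{ network_provider }}"
        connections: "{{ network_connections }}"
      register: __network_connections_result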
23826 1726867434.78822: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 23826 1726867434.79191: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000029 23826 1726867434.79210: variable 'ansible_search_path' from source: unknown 23826 1726867434.79214: variable 'ansible_search_path' from source: unknown 23826 1726867434.79357: calling self._execute() 23826 1726867434.79461: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867434.79465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867434.79470: variable 'omit' from source: magic vars 23826 1726867434.80466: variable 'ansible_distribution_major_version' from source: facts 23826 1726867434.80736: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867434.80742: variable 'omit' from source: magic vars 23826 1726867434.80745: variable 'omit' from source: magic vars 23826 1726867434.80748: variable 'omit' from source: magic vars 23826 1726867434.81198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867434.81202: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867434.81204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867434.81295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867434.81299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867434.81301: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867434.81304: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867434.81306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867434.81630: Set connection var ansible_timeout to 10 23826 1726867434.81638: Set connection var ansible_shell_executable to /bin/sh 23826 1726867434.81641: Set connection var ansible_connection to ssh 23826 1726867434.81649: Set connection var ansible_pipelining to False 23826 1726867434.81652: Set connection var ansible_shell_type to sh 23826 1726867434.81658: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867434.81926: variable 'ansible_shell_executable' from source: unknown 23826 1726867434.81930: variable 'ansible_connection' from source: unknown 23826 1726867434.81933: variable 'ansible_module_compression' from source: unknown 23826 1726867434.81935: variable 'ansible_shell_type' from source: unknown 23826 1726867434.81937: variable 'ansible_shell_executable' from source: unknown 23826 1726867434.81945: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867434.81947: variable 'ansible_pipelining' from source: unknown 23826 1726867434.81949: variable 'ansible_timeout' from source: unknown 23826 1726867434.81951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867434.82494: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 
1726867434.82498: variable 'omit' from source: magic vars 23826 1726867434.82502: starting attempt loop 23826 1726867434.82508: running the handler 23826 1726867434.82944: variable '__network_connections_result' from source: set_fact 23826 1726867434.82947: handler run complete 23826 1726867434.82950: attempt loop complete, returning result 23826 1726867434.82952: _execute() done 23826 1726867434.82954: dumping result to json 23826 1726867434.82956: done dumping result, returning 23826 1726867434.82958: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-a92d-a3ea-000000000029] 23826 1726867434.82960: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000029 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 41871d48-5e02-497c-b448-55b0b16ff70d" ] } 23826 1726867434.83287: no more pending results, returning what we have 23826 1726867434.83290: results queue empty 23826 1726867434.83291: checking for any_errors_fatal 23826 1726867434.83295: done checking for any_errors_fatal 23826 1726867434.83295: checking for max_fail_percentage 23826 1726867434.83297: done checking for max_fail_percentage 23826 1726867434.83297: checking to see if all hosts have failed and the running result is not ok 23826 1726867434.83298: done checking to see if all hosts have failed 23826 1726867434.83299: getting the remaining hosts for this loop 23826 1726867434.83300: done getting the remaining hosts for this loop 23826 1726867434.83303: getting the next task for host managed_node2 23826 1726867434.83309: done getting next task for host managed_node2 23826 1726867434.83313: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 23826 1726867434.83315: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867434.83326: getting variables 23826 1726867434.83328: in VariableManager get_vars() 23826 1726867434.83365: Calling all_inventory to load vars for managed_node2 23826 1726867434.83367: Calling groups_inventory to load vars for managed_node2 23826 1726867434.83370: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867434.83380: Calling all_plugins_play to load vars for managed_node2 23826 1726867434.83383: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867434.83390: Calling groups_plugins_play to load vars for managed_node2 23826 1726867434.84025: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000029 23826 1726867434.84029: WORKER PROCESS EXITING 23826 1726867434.86867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867434.90632: done with get_vars() 23826 1726867434.90686: done getting variables 23826 1726867434.90817: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:23:54 -0400 (0:00:00.129) 0:00:16.919 ****** 23826 1726867434.90860: entering _queue_task() for managed_node2/debug 23826 1726867434.91343: worker is 1 (out of 1 available) 23826 1726867434.91384: exiting _queue_task() for managed_node2/debug 23826 1726867434.91397: done queuing things up, now waiting for results queue to drain 23826 1726867434.91398: waiting for pending results... 
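The next task, "Show debug messages for the network_connections" (roles/network/tasks/main.yml:181), dumps the entire registered result rather than just its stderr lines, which is why the output that follows includes the module's _invocation and module_args. A sketch consistent with that output; the exact task body in the role is an assumption.

    # Hypothetical sketch of the role task at roles/network/tasks/main.yml:181;
    # dumping the whole registered variable is what surfaces _invocation/module_args below.
    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result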
23826 1726867434.91797: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 23826 1726867434.92376: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000002a 23826 1726867434.92382: variable 'ansible_search_path' from source: unknown 23826 1726867434.92384: variable 'ansible_search_path' from source: unknown 23826 1726867434.92432: calling self._execute() 23826 1726867434.92652: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867434.92658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867434.92683: variable 'omit' from source: magic vars 23826 1726867434.93526: variable 'ansible_distribution_major_version' from source: facts 23826 1726867434.93536: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867434.93562: variable 'omit' from source: magic vars 23826 1726867434.93596: variable 'omit' from source: magic vars 23826 1726867434.93750: variable 'omit' from source: magic vars 23826 1726867434.93800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867434.93885: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867434.93958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867434.94060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867434.94075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867434.94117: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867434.94124: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867434.94126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867434.94455: Set connection var ansible_timeout to 10 23826 1726867434.94458: Set connection var ansible_shell_executable to /bin/sh 23826 1726867434.94460: Set connection var ansible_connection to ssh 23826 1726867434.94462: Set connection var ansible_pipelining to False 23826 1726867434.94465: Set connection var ansible_shell_type to sh 23826 1726867434.94467: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867434.94535: variable 'ansible_shell_executable' from source: unknown 23826 1726867434.94538: variable 'ansible_connection' from source: unknown 23826 1726867434.94541: variable 'ansible_module_compression' from source: unknown 23826 1726867434.94543: variable 'ansible_shell_type' from source: unknown 23826 1726867434.94546: variable 'ansible_shell_executable' from source: unknown 23826 1726867434.94548: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867434.94553: variable 'ansible_pipelining' from source: unknown 23826 1726867434.94563: variable 'ansible_timeout' from source: unknown 23826 1726867434.94565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867434.94972: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 
1726867434.95012: variable 'omit' from source: magic vars 23826 1726867434.95082: starting attempt loop 23826 1726867434.95086: running the handler 23826 1726867434.95088: variable '__network_connections_result' from source: set_fact 23826 1726867434.95258: variable '__network_connections_result' from source: set_fact 23826 1726867434.95495: handler run complete 23826 1726867434.95572: attempt loop complete, returning result 23826 1726867434.95575: _execute() done 23826 1726867434.95579: dumping result to json 23826 1726867434.95581: done dumping result, returning 23826 1726867434.95712: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-a92d-a3ea-00000000002a] 23826 1726867434.95731: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000002a 23826 1726867434.95942: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000002a 23826 1726867434.95945: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "ethtest0", "ip": { "ipv6_disabled": true }, "name": "ethtest0", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 41871d48-5e02-497c-b448-55b0b16ff70d\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 41871d48-5e02-497c-b448-55b0b16ff70d" ] } } 23826 1726867434.96040: no more pending results, returning what we have 23826 1726867434.96043: results queue empty 23826 1726867434.96044: checking for any_errors_fatal 23826 1726867434.96050: done checking for any_errors_fatal 23826 1726867434.96050: checking for max_fail_percentage 23826 1726867434.96053: done checking for max_fail_percentage 23826 1726867434.96054: checking to see if all hosts have failed and the running result is not ok 23826 1726867434.96055: done checking to see if all hosts have failed 23826 1726867434.96055: getting the remaining hosts for this loop 23826 1726867434.96057: done getting the remaining hosts for this loop 23826 1726867434.96060: getting the next task for host managed_node2 23826 1726867434.96068: done getting next task for host managed_node2 23826 1726867434.96072: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 23826 1726867434.96076: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867434.96088: getting variables 23826 1726867434.96090: in VariableManager get_vars() 23826 1726867434.96125: Calling all_inventory to load vars for managed_node2 23826 1726867434.96128: Calling groups_inventory to load vars for managed_node2 23826 1726867434.96131: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867434.96139: Calling all_plugins_play to load vars for managed_node2 23826 1726867434.96143: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867434.96146: Calling groups_plugins_play to load vars for managed_node2 23826 1726867434.99845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867435.03033: done with get_vars() 23826 1726867435.03058: done getting variables 23826 1726867435.03240: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:23:55 -0400 (0:00:00.124) 0:00:17.044 ****** 23826 1726867435.03390: entering _queue_task() for managed_node2/debug 23826 1726867435.04147: worker is 1 (out of 1 available) 23826 1726867435.04162: exiting _queue_task() for managed_node2/debug 23826 1726867435.04174: done queuing things up, now waiting for results queue to drain 23826 1726867435.04175: waiting for pending results... 23826 1726867435.04489: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 23826 1726867435.04833: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000002b 23826 1726867435.04883: variable 'ansible_search_path' from source: unknown 23826 1726867435.04887: variable 'ansible_search_path' from source: unknown 23826 1726867435.04890: calling self._execute() 23826 1726867435.05167: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867435.05171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867435.05183: variable 'omit' from source: magic vars 23826 1726867435.05975: variable 'ansible_distribution_major_version' from source: facts 23826 1726867435.05988: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867435.06235: variable 'network_state' from source: role '' defaults 23826 1726867435.06435: Evaluated conditional (network_state != {}): False 23826 1726867435.06439: when evaluation is False, skipping this task 23826 1726867435.06441: _execute() done 23826 1726867435.06444: dumping result to json 23826 1726867435.06446: done dumping result, returning 23826 1726867435.06449: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-a92d-a3ea-00000000002b] 23826 1726867435.06452: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000002b 23826 1726867435.06523: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000002b 23826 1726867435.06526: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 23826 1726867435.06571: no more pending results, returning what we 
have 23826 1726867435.06575: results queue empty 23826 1726867435.06578: checking for any_errors_fatal 23826 1726867435.06589: done checking for any_errors_fatal 23826 1726867435.06590: checking for max_fail_percentage 23826 1726867435.06592: done checking for max_fail_percentage 23826 1726867435.06593: checking to see if all hosts have failed and the running result is not ok 23826 1726867435.06594: done checking to see if all hosts have failed 23826 1726867435.06595: getting the remaining hosts for this loop 23826 1726867435.06597: done getting the remaining hosts for this loop 23826 1726867435.06600: getting the next task for host managed_node2 23826 1726867435.06612: done getting next task for host managed_node2 23826 1726867435.06615: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 23826 1726867435.06619: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867435.06635: getting variables 23826 1726867435.06637: in VariableManager get_vars() 23826 1726867435.06674: Calling all_inventory to load vars for managed_node2 23826 1726867435.06679: Calling groups_inventory to load vars for managed_node2 23826 1726867435.06681: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867435.06692: Calling all_plugins_play to load vars for managed_node2 23826 1726867435.06694: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867435.06696: Calling groups_plugins_play to load vars for managed_node2 23826 1726867435.09349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867435.12810: done with get_vars() 23826 1726867435.12834: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:23:55 -0400 (0:00:00.096) 0:00:17.140 ****** 23826 1726867435.12942: entering _queue_task() for managed_node2/ping 23826 1726867435.12944: Creating lock for ping 23826 1726867435.13537: worker is 1 (out of 1 available) 23826 1726867435.13550: exiting _queue_task() for managed_node2/ping 23826 1726867435.13562: done queuing things up, now waiting for results queue to drain 23826 1726867435.13564: waiting for pending results... 
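The module_args echoed above show what the test play fed to the role: a single ethernet profile named ethtest0 with IPv6 disabled, applied through the nm provider. A hedged reconstruction of that invocation is sketched below; the actual structure of tests_ipv6_disabled.yml may differ in detail. Because the play sets only network_connections, the role default network_state stays {}, which is why "Show debug messages for the network_state" is skipped with false_condition network_state != {}. The "Re-test connectivity" task queued next is the role's post-change check and runs the ping module, as the pong result further down confirms.

    # Hypothetical play snippet reconstructed from the module_args echoed above;
    # the real wording of tests_ipv6_disabled.yml may differ.
    - hosts: managed_node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_connections:
              - name: ethtest0
                interface_name: ethtest0
                type: ethernet
                ip:
                  ipv6_disabled: true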
23826 1726867435.13933: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 23826 1726867435.13959: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000002c 23826 1726867435.13974: variable 'ansible_search_path' from source: unknown 23826 1726867435.13979: variable 'ansible_search_path' from source: unknown 23826 1726867435.14029: calling self._execute() 23826 1726867435.14110: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867435.14114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867435.14136: variable 'omit' from source: magic vars 23826 1726867435.14518: variable 'ansible_distribution_major_version' from source: facts 23826 1726867435.14530: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867435.14536: variable 'omit' from source: magic vars 23826 1726867435.14599: variable 'omit' from source: magic vars 23826 1726867435.14635: variable 'omit' from source: magic vars 23826 1726867435.14682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867435.14719: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867435.14738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867435.14755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867435.14767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867435.14805: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867435.14812: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867435.14814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867435.14934: Set connection var ansible_timeout to 10 23826 1726867435.14949: Set connection var ansible_shell_executable to /bin/sh 23826 1726867435.14953: Set connection var ansible_connection to ssh 23826 1726867435.14961: Set connection var ansible_pipelining to False 23826 1726867435.14964: Set connection var ansible_shell_type to sh 23826 1726867435.14969: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867435.15002: variable 'ansible_shell_executable' from source: unknown 23826 1726867435.15005: variable 'ansible_connection' from source: unknown 23826 1726867435.15010: variable 'ansible_module_compression' from source: unknown 23826 1726867435.15013: variable 'ansible_shell_type' from source: unknown 23826 1726867435.15015: variable 'ansible_shell_executable' from source: unknown 23826 1726867435.15018: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867435.15020: variable 'ansible_pipelining' from source: unknown 23826 1726867435.15022: variable 'ansible_timeout' from source: unknown 23826 1726867435.15024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867435.15241: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 23826 1726867435.15250: variable 'omit' from source: magic vars 23826 
1726867435.15256: starting attempt loop 23826 1726867435.15259: running the handler 23826 1726867435.15273: _low_level_execute_command(): starting 23826 1726867435.15283: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867435.16353: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 23826 1726867435.16369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867435.16406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867435.16444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867435.16505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867435.16598: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867435.16616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867435.16649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867435.18332: stdout chunk (state=3): >>>/root <<< 23826 1726867435.18490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867435.18494: stdout chunk (state=3): >>><<< 23826 1726867435.18496: stderr chunk (state=3): >>><<< 23826 1726867435.18617: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867435.18623: _low_level_execute_command(): starting 23826 1726867435.18626: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867435.1853096-24722-114173775166004 `" && echo ansible-tmp-1726867435.1853096-24722-114173775166004="` echo /root/.ansible/tmp/ansible-tmp-1726867435.1853096-24722-114173775166004 `" ) && sleep 0' 23826 1726867435.19198: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867435.19213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867435.19238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867435.19268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867435.19289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867435.19359: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867435.19427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867435.19468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867435.19528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867435.21536: stdout chunk (state=3): >>>ansible-tmp-1726867435.1853096-24722-114173775166004=/root/.ansible/tmp/ansible-tmp-1726867435.1853096-24722-114173775166004 <<< 23826 1726867435.21719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867435.21726: stdout chunk (state=3): >>><<< 23826 1726867435.21729: stderr chunk (state=3): >>><<< 23826 1726867435.21806: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867435.1853096-24722-114173775166004=/root/.ansible/tmp/ansible-tmp-1726867435.1853096-24722-114173775166004 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867435.21813: variable 'ansible_module_compression' from source: unknown 23826 1726867435.21862: ANSIBALLZ: Using lock for ping 23826 1726867435.21869: ANSIBALLZ: Acquiring lock 23826 1726867435.21879: ANSIBALLZ: Lock acquired: 139851305208480 23826 1726867435.21916: ANSIBALLZ: Creating module 23826 1726867435.33532: ANSIBALLZ: Writing module into payload 23826 1726867435.33592: ANSIBALLZ: Writing module 23826 1726867435.33613: ANSIBALLZ: Renaming module 23826 1726867435.33682: ANSIBALLZ: Done creating module 23826 1726867435.33685: variable 'ansible_facts' from source: unknown 23826 1726867435.33711: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867435.1853096-24722-114173775166004/AnsiballZ_ping.py 23826 1726867435.33942: Sending initial data 23826 1726867435.33946: Sent initial data (153 bytes) 23826 1726867435.34521: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867435.34552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867435.34629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867435.36315: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 23826 1726867435.36327: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867435.36364: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867435.36428: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpetfpjek8 /root/.ansible/tmp/ansible-tmp-1726867435.1853096-24722-114173775166004/AnsiballZ_ping.py <<< 23826 1726867435.36432: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867435.1853096-24722-114173775166004/AnsiballZ_ping.py" <<< 23826 1726867435.36473: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpetfpjek8" to remote "/root/.ansible/tmp/ansible-tmp-1726867435.1853096-24722-114173775166004/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867435.1853096-24722-114173775166004/AnsiballZ_ping.py" <<< 23826 1726867435.37184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867435.37192: stderr chunk (state=3): >>><<< 23826 1726867435.37203: stdout chunk (state=3): >>><<< 23826 1726867435.37244: done transferring module to remote 23826 1726867435.37256: _low_level_execute_command(): starting 23826 1726867435.37383: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867435.1853096-24722-114173775166004/ /root/.ansible/tmp/ansible-tmp-1726867435.1853096-24722-114173775166004/AnsiballZ_ping.py && sleep 0' 23826 1726867435.37942: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867435.38092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867435.38176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867435.40087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867435.40094: stdout chunk (state=3): >>><<< 23826 1726867435.40097: stderr chunk (state=3): >>><<< 23826 1726867435.40253: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867435.40257: _low_level_execute_command(): starting 23826 1726867435.40259: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867435.1853096-24722-114173775166004/AnsiballZ_ping.py && sleep 0' 23826 1726867435.41070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867435.41080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867435.41083: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867435.41234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867435.41382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867435.56696: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 23826 1726867435.58111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 23826 1726867435.58116: stdout chunk (state=3): >>><<< 23826 1726867435.58118: stderr chunk (state=3): >>><<< 23826 1726867435.58135: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 23826 1726867435.58160: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867435.1853096-24722-114173775166004/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867435.58170: _low_level_execute_command(): starting 23826 1726867435.58175: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867435.1853096-24722-114173775166004/ > /dev/null 2>&1 && sleep 0' 23826 1726867435.59027: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867435.59285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: 
fd 3 setting O_NONBLOCK <<< 23826 1726867435.59494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867435.59720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867435.61572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867435.61575: stdout chunk (state=3): >>><<< 23826 1726867435.61584: stderr chunk (state=3): >>><<< 23826 1726867435.61610: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867435.61619: handler run complete 23826 1726867435.61622: attempt loop complete, returning result 23826 1726867435.61624: _execute() done 23826 1726867435.61627: dumping result to json 23826 1726867435.61629: done dumping result, returning 23826 1726867435.61639: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-a92d-a3ea-00000000002c] 23826 1726867435.61642: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000002c ok: [managed_node2] => { "changed": false, "ping": "pong" } 23826 1726867435.61918: no more pending results, returning what we have 23826 1726867435.61921: results queue empty 23826 1726867435.61922: checking for any_errors_fatal 23826 1726867435.61930: done checking for any_errors_fatal 23826 1726867435.61931: checking for max_fail_percentage 23826 1726867435.61933: done checking for max_fail_percentage 23826 1726867435.61934: checking to see if all hosts have failed and the running result is not ok 23826 1726867435.61935: done checking to see if all hosts have failed 23826 1726867435.61936: getting the remaining hosts for this loop 23826 1726867435.61938: done getting the remaining hosts for this loop 23826 1726867435.61942: getting the next task for host managed_node2 23826 1726867435.61951: done getting next task for host managed_node2 23826 1726867435.61953: ^ task is: TASK: meta (role_complete) 23826 1726867435.61955: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867435.61967: getting variables 23826 1726867435.61968: in VariableManager get_vars() 23826 1726867435.62014: Calling all_inventory to load vars for managed_node2 23826 1726867435.62017: Calling groups_inventory to load vars for managed_node2 23826 1726867435.62019: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867435.62029: Calling all_plugins_play to load vars for managed_node2 23826 1726867435.62032: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867435.62034: Calling groups_plugins_play to load vars for managed_node2 23826 1726867435.62637: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000002c 23826 1726867435.62640: WORKER PROCESS EXITING 23826 1726867435.65379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867435.67805: done with get_vars() 23826 1726867435.67830: done getting variables 23826 1726867435.67912: done queuing things up, now waiting for results queue to drain 23826 1726867435.67914: results queue empty 23826 1726867435.67915: checking for any_errors_fatal 23826 1726867435.67917: done checking for any_errors_fatal 23826 1726867435.67918: checking for max_fail_percentage 23826 1726867435.67919: done checking for max_fail_percentage 23826 1726867435.67920: checking to see if all hosts have failed and the running result is not ok 23826 1726867435.67920: done checking to see if all hosts have failed 23826 1726867435.67921: getting the remaining hosts for this loop 23826 1726867435.67922: done getting the remaining hosts for this loop 23826 1726867435.67925: getting the next task for host managed_node2 23826 1726867435.67930: done getting next task for host managed_node2 23826 1726867435.67932: ^ task is: TASK: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it 23826 1726867435.67934: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867435.67936: getting variables 23826 1726867435.67937: in VariableManager get_vars() 23826 1726867435.67950: Calling all_inventory to load vars for managed_node2 23826 1726867435.67952: Calling groups_inventory to load vars for managed_node2 23826 1726867435.67954: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867435.67990: Calling all_plugins_play to load vars for managed_node2 23826 1726867435.67994: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867435.67997: Calling groups_plugins_play to load vars for managed_node2 23826 1726867435.69228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867435.71344: done with get_vars() 23826 1726867435.71364: done getting variables 23826 1726867435.71437: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:41 Friday 20 September 2024 17:23:55 -0400 (0:00:00.585) 0:00:17.726 ****** 23826 1726867435.71503: entering _queue_task() for managed_node2/assert 23826 1726867435.72072: worker is 1 (out of 1 available) 23826 1726867435.72088: exiting _queue_task() for managed_node2/assert 23826 1726867435.72100: done queuing things up, now waiting for results queue to drain 23826 1726867435.72101: waiting for pending results... 
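The assert queued here (tests_ipv6_disabled.yml:41) only runs when applying the profiles failed; in this run __network_connections_result.failed is False, so the task is skipped just below. A hedged sketch of the shape of such a guarded assert follows; the when condition is taken from the false_condition recorded in the log, while the assertion text itself is not reproduced in this excerpt and is therefore an assumption.

    # Hypothetical sketch of the task at tests_ipv6_disabled.yml:41; the real assertion
    # text is not reproduced in this excerpt, so the 'that' condition is an assumption.
    - name: >-
        Assert that configuring `ipv6_disabled` will only fail when the running
        version of NetworkManager does not support it
      ansible.builtin.assert:
        that:
          - "'ipv6' in __network_connections_result.stderr"
        fail_msg: Disabling IPv6 failed although this NetworkManager should support it
      when: __network_connections_result.failed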
23826 1726867435.72670: running TaskExecutor() for managed_node2/TASK: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it 23826 1726867435.72747: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000005c 23826 1726867435.72763: variable 'ansible_search_path' from source: unknown 23826 1726867435.72978: calling self._execute() 23826 1726867435.73115: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867435.73289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867435.73293: variable 'omit' from source: magic vars 23826 1726867435.73523: variable 'ansible_distribution_major_version' from source: facts 23826 1726867435.73540: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867435.73651: variable '__network_connections_result' from source: set_fact 23826 1726867435.73670: Evaluated conditional (__network_connections_result.failed): False 23826 1726867435.73673: when evaluation is False, skipping this task 23826 1726867435.73676: _execute() done 23826 1726867435.73681: dumping result to json 23826 1726867435.73684: done dumping result, returning 23826 1726867435.73689: done running TaskExecutor() for managed_node2/TASK: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it [0affcac9-a3a5-a92d-a3ea-00000000005c] 23826 1726867435.73695: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000005c skipping: [managed_node2] => { "changed": false, "false_condition": "__network_connections_result.failed", "skip_reason": "Conditional result was False" } 23826 1726867435.73834: no more pending results, returning what we have 23826 1726867435.73839: results queue empty 23826 1726867435.73840: checking for any_errors_fatal 23826 1726867435.73842: done checking for any_errors_fatal 23826 1726867435.73842: checking for max_fail_percentage 23826 1726867435.73844: done checking for max_fail_percentage 23826 1726867435.73845: checking to see if all hosts have failed and the running result is not ok 23826 1726867435.73846: done checking to see if all hosts have failed 23826 1726867435.73847: getting the remaining hosts for this loop 23826 1726867435.73849: done getting the remaining hosts for this loop 23826 1726867435.73853: getting the next task for host managed_node2 23826 1726867435.73860: done getting next task for host managed_node2 23826 1726867435.73862: ^ task is: TASK: Verify nmcli connection ipv6.method 23826 1726867435.73865: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867435.73869: getting variables 23826 1726867435.73871: in VariableManager get_vars() 23826 1726867435.73913: Calling all_inventory to load vars for managed_node2 23826 1726867435.73916: Calling groups_inventory to load vars for managed_node2 23826 1726867435.73918: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867435.73934: Calling all_plugins_play to load vars for managed_node2 23826 1726867435.73937: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867435.73942: Calling groups_plugins_play to load vars for managed_node2 23826 1726867435.74460: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000005c 23826 1726867435.74464: WORKER PROCESS EXITING 23826 1726867435.76910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867435.79286: done with get_vars() 23826 1726867435.79306: done getting variables 23826 1726867435.79379: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Verify nmcli connection ipv6.method] ************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:48 Friday 20 September 2024 17:23:55 -0400 (0:00:00.078) 0:00:17.805 ****** 23826 1726867435.79401: entering _queue_task() for managed_node2/shell 23826 1726867435.79403: Creating lock for shell 23826 1726867435.79658: worker is 1 (out of 1 available) 23826 1726867435.79671: exiting _queue_task() for managed_node2/shell 23826 1726867435.79684: done queuing things up, now waiting for results queue to drain 23826 1726867435.79686: waiting for pending results... 
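The verification task queued here (tests_ipv6_disabled.yml:48) shells out to nmcli on the managed node; the per-task variable loading below shows 'interface' resolving from set_fact, i.e. ethtest0. The exact command line is not visible in this excerpt, so the following is an assumed equivalent that checks the profile really ended up with ipv6.method set to disabled.

    # Hypothetical sketch of the verification task at tests_ipv6_disabled.yml:48;
    # the exact command line is not visible in this excerpt.
    - name: Verify nmcli connection ipv6.method
      ansible.builtin.shell: |
        nmcli -g ipv6.method connection show "{{ interface }}" | grep -x disabled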
23826 1726867435.79865: running TaskExecutor() for managed_node2/TASK: Verify nmcli connection ipv6.method 23826 1726867435.79932: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000005d 23826 1726867435.79944: variable 'ansible_search_path' from source: unknown 23826 1726867435.79973: calling self._execute() 23826 1726867435.80046: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867435.80051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867435.80062: variable 'omit' from source: magic vars 23826 1726867435.80324: variable 'ansible_distribution_major_version' from source: facts 23826 1726867435.80333: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867435.80419: variable '__network_connections_result' from source: set_fact 23826 1726867435.80433: Evaluated conditional (not __network_connections_result.failed): True 23826 1726867435.80438: variable 'omit' from source: magic vars 23826 1726867435.80461: variable 'omit' from source: magic vars 23826 1726867435.80525: variable 'interface' from source: set_fact 23826 1726867435.80539: variable 'omit' from source: magic vars 23826 1726867435.80574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867435.80604: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867435.80620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867435.80634: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867435.80644: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867435.80667: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867435.80675: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867435.80680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867435.80749: Set connection var ansible_timeout to 10 23826 1726867435.80756: Set connection var ansible_shell_executable to /bin/sh 23826 1726867435.80759: Set connection var ansible_connection to ssh 23826 1726867435.80766: Set connection var ansible_pipelining to False 23826 1726867435.80768: Set connection var ansible_shell_type to sh 23826 1726867435.80773: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867435.80793: variable 'ansible_shell_executable' from source: unknown 23826 1726867435.80797: variable 'ansible_connection' from source: unknown 23826 1726867435.80800: variable 'ansible_module_compression' from source: unknown 23826 1726867435.80802: variable 'ansible_shell_type' from source: unknown 23826 1726867435.80804: variable 'ansible_shell_executable' from source: unknown 23826 1726867435.80806: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867435.80812: variable 'ansible_pipelining' from source: unknown 23826 1726867435.80815: variable 'ansible_timeout' from source: unknown 23826 1726867435.80817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867435.80917: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867435.80932: variable 'omit' from source: magic vars 23826 1726867435.80936: starting attempt loop 23826 1726867435.80938: running the handler 23826 1726867435.80943: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867435.80960: _low_level_execute_command(): starting 23826 1726867435.80967: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867435.81596: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867435.81600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867435.81680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867435.83358: stdout chunk (state=3): >>>/root <<< 23826 1726867435.83459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867435.83480: stderr chunk (state=3): >>><<< 23826 1726867435.83483: stdout chunk (state=3): >>><<< 23826 1726867435.83502: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867435.83516: _low_level_execute_command(): starting 23826 1726867435.83521: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867435.8350213-24755-110769586450397 `" && echo ansible-tmp-1726867435.8350213-24755-110769586450397="` echo /root/.ansible/tmp/ansible-tmp-1726867435.8350213-24755-110769586450397 `" ) && sleep 0' 23826 1726867435.83921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867435.83925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867435.83933: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867435.83951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867435.83984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867435.83988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867435.84037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867435.86037: stdout chunk (state=3): >>>ansible-tmp-1726867435.8350213-24755-110769586450397=/root/.ansible/tmp/ansible-tmp-1726867435.8350213-24755-110769586450397 <<< 23826 1726867435.86158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867435.86171: stdout chunk (state=3): >>><<< 23826 1726867435.86186: stderr chunk (state=3): >>><<< 23826 1726867435.86197: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867435.8350213-24755-110769586450397=/root/.ansible/tmp/ansible-tmp-1726867435.8350213-24755-110769586450397 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867435.86223: variable 'ansible_module_compression' from source: unknown 23826 1726867435.86262: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 23826 1726867435.86289: variable 'ansible_facts' from source: unknown 23826 1726867435.86346: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867435.8350213-24755-110769586450397/AnsiballZ_command.py 23826 1726867435.86437: Sending initial data 23826 1726867435.86440: Sent initial data (156 bytes) 23826 1726867435.86915: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867435.86920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867435.86922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867435.86925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867435.86927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867435.86975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867435.86984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867435.87024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867435.88644: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867435.88705: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867435.88752: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpfdl0uf5i /root/.ansible/tmp/ansible-tmp-1726867435.8350213-24755-110769586450397/AnsiballZ_command.py <<< 23826 1726867435.88755: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867435.8350213-24755-110769586450397/AnsiballZ_command.py" <<< 23826 1726867435.88789: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpfdl0uf5i" to remote "/root/.ansible/tmp/ansible-tmp-1726867435.8350213-24755-110769586450397/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867435.8350213-24755-110769586450397/AnsiballZ_command.py" <<< 23826 1726867435.89617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867435.89657: stderr chunk (state=3): >>><<< 23826 1726867435.89662: stdout chunk (state=3): >>><<< 23826 1726867435.89686: done transferring module to remote 23826 1726867435.89689: _low_level_execute_command(): starting 23826 1726867435.89721: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867435.8350213-24755-110769586450397/ /root/.ansible/tmp/ansible-tmp-1726867435.8350213-24755-110769586450397/AnsiballZ_command.py && sleep 0' 23826 1726867435.90544: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867435.90573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867435.90662: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867435.90727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867435.90749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867435.90792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867435.90884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867435.92789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867435.92839: stderr chunk (state=3): >>><<< 23826 1726867435.92843: stdout chunk (state=3): >>><<< 23826 1726867435.92970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867435.92973: _low_level_execute_command(): starting 23826 1726867435.92976: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867435.8350213-24755-110769586450397/AnsiballZ_command.py && sleep 0' 23826 1726867435.93760: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867435.93856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867435.93880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867435.93942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867435.94019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867435.94093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867435.94171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867436.11653: stdout chunk (state=3): >>> {"changed": true, "stdout": "ipv6.method: disabled", "stderr": "+ nmcli connection show ethtest0\n+ grep ipv6.method", "rc": 0, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "start": "2024-09-20 17:23:56.096846", "end": "2024-09-20 17:23:56.114909", "delta": "0:00:00.018063", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 23826 1726867436.13445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 23826 1726867436.13450: stdout chunk (state=3): >>><<< 23826 1726867436.13452: stderr chunk (state=3): >>><<< 23826 1726867436.13635: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv6.method: disabled", "stderr": "+ nmcli connection show ethtest0\n+ grep ipv6.method", "rc": 0, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "start": "2024-09-20 17:23:56.096846", "end": "2024-09-20 17:23:56.114909", "delta": "0:00:00.018063", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
23826 1726867436.13640: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867435.8350213-24755-110769586450397/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867436.13643: _low_level_execute_command(): starting 23826 1726867436.13646: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867435.8350213-24755-110769586450397/ > /dev/null 2>&1 && sleep 0' 23826 1726867436.14431: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867436.14538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867436.14571: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867436.16504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867436.16509: stdout chunk (state=3): >>><<< 23826 1726867436.16513: stderr chunk (state=3): >>><<< 23826 1726867436.16541: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867436.16683: handler run complete 23826 1726867436.16686: Evaluated conditional (False): False 23826 1726867436.16688: attempt loop complete, returning result 23826 1726867436.16690: _execute() done 23826 1726867436.16692: dumping result to json 23826 1726867436.16694: done dumping result, returning 23826 1726867436.16696: done running TaskExecutor() for managed_node2/TASK: Verify nmcli connection ipv6.method [0affcac9-a3a5-a92d-a3ea-00000000005d] 23826 1726867436.16698: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000005d 23826 1726867436.16769: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000005d 23826 1726867436.16773: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "delta": "0:00:00.018063", "end": "2024-09-20 17:23:56.114909", "rc": 0, "start": "2024-09-20 17:23:56.096846" } STDOUT: ipv6.method: disabled STDERR: + nmcli connection show ethtest0 + grep ipv6.method 23826 1726867436.16868: no more pending results, returning what we have 23826 1726867436.16872: results queue empty 23826 1726867436.16873: checking for any_errors_fatal 23826 1726867436.16905: done checking for any_errors_fatal 23826 1726867436.16906: checking for max_fail_percentage 23826 1726867436.16911: done checking for max_fail_percentage 23826 1726867436.16912: checking to see if all hosts have failed and the running result is not ok 23826 1726867436.16913: done checking to see if all hosts have failed 23826 1726867436.16914: getting the remaining hosts for this loop 23826 1726867436.16915: done getting the remaining hosts for this loop 23826 1726867436.16919: getting the next task for host managed_node2 23826 1726867436.16926: done getting next task for host managed_node2 23826 1726867436.16929: ^ task is: TASK: Assert that ipv6.method disabled is configured correctly 23826 1726867436.16933: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867436.16937: getting variables 23826 1726867436.17080: in VariableManager get_vars() 23826 1726867436.17159: Calling all_inventory to load vars for managed_node2 23826 1726867436.17162: Calling groups_inventory to load vars for managed_node2 23826 1726867436.17164: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867436.17175: Calling all_plugins_play to load vars for managed_node2 23826 1726867436.17180: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867436.17183: Calling groups_plugins_play to load vars for managed_node2 23826 1726867436.19758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867436.22201: done with get_vars() 23826 1726867436.22231: done getting variables 23826 1726867436.22475: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that ipv6.method disabled is configured correctly] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:57 Friday 20 September 2024 17:23:56 -0400 (0:00:00.431) 0:00:18.236 ****** 23826 1726867436.22658: entering _queue_task() for managed_node2/assert 23826 1726867436.23412: worker is 1 (out of 1 available) 23826 1726867436.23424: exiting _queue_task() for managed_node2/assert 23826 1726867436.23437: done queuing things up, now waiting for results queue to drain 23826 1726867436.23441: waiting for pending results... 23826 1726867436.24050: running TaskExecutor() for managed_node2/TASK: Assert that ipv6.method disabled is configured correctly 23826 1726867436.24056: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000005e 23826 1726867436.24058: variable 'ansible_search_path' from source: unknown 23826 1726867436.24061: calling self._execute() 23826 1726867436.24064: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867436.24066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867436.24069: variable 'omit' from source: magic vars 23826 1726867436.24604: variable 'ansible_distribution_major_version' from source: facts 23826 1726867436.24613: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867436.24738: variable '__network_connections_result' from source: set_fact 23826 1726867436.24802: Evaluated conditional (not __network_connections_result.failed): True 23826 1726867436.24811: variable 'omit' from source: magic vars 23826 1726867436.24815: variable 'omit' from source: magic vars 23826 1726867436.24827: variable 'omit' from source: magic vars 23826 1726867436.24970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867436.25111: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867436.25127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867436.25145: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867436.25158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 23826 1726867436.25190: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867436.25193: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867436.25196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867436.25297: Set connection var ansible_timeout to 10 23826 1726867436.25306: Set connection var ansible_shell_executable to /bin/sh 23826 1726867436.25312: Set connection var ansible_connection to ssh 23826 1726867436.25322: Set connection var ansible_pipelining to False 23826 1726867436.25324: Set connection var ansible_shell_type to sh 23826 1726867436.25331: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867436.25391: variable 'ansible_shell_executable' from source: unknown 23826 1726867436.25395: variable 'ansible_connection' from source: unknown 23826 1726867436.25400: variable 'ansible_module_compression' from source: unknown 23826 1726867436.25402: variable 'ansible_shell_type' from source: unknown 23826 1726867436.25404: variable 'ansible_shell_executable' from source: unknown 23826 1726867436.25410: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867436.25414: variable 'ansible_pipelining' from source: unknown 23826 1726867436.25417: variable 'ansible_timeout' from source: unknown 23826 1726867436.25501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867436.25585: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867436.25612: variable 'omit' from source: magic vars 23826 1726867436.25615: starting attempt loop 23826 1726867436.25617: running the handler 23826 1726867436.25830: variable 'ipv6_method' from source: set_fact 23826 1726867436.25833: Evaluated conditional ('disabled' in ipv6_method.stdout): True 23826 1726867436.25835: handler run complete 23826 1726867436.25837: attempt loop complete, returning result 23826 1726867436.25839: _execute() done 23826 1726867436.25842: dumping result to json 23826 1726867436.25844: done dumping result, returning 23826 1726867436.25846: done running TaskExecutor() for managed_node2/TASK: Assert that ipv6.method disabled is configured correctly [0affcac9-a3a5-a92d-a3ea-00000000005e] 23826 1726867436.25849: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000005e 23826 1726867436.25984: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000005e 23826 1726867436.26027: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 23826 1726867436.26076: no more pending results, returning what we have 23826 1726867436.26081: results queue empty 23826 1726867436.26083: checking for any_errors_fatal 23826 1726867436.26089: done checking for any_errors_fatal 23826 1726867436.26090: checking for max_fail_percentage 23826 1726867436.26092: done checking for max_fail_percentage 23826 1726867436.26092: checking to see if all hosts have failed and the running result is not ok 23826 1726867436.26093: done checking to see if all hosts have failed 23826 1726867436.26094: getting the remaining hosts for this loop 23826 1726867436.26096: done getting the remaining hosts for this loop 23826 1726867436.26099: getting the 
next task for host managed_node2 23826 1726867436.26105: done getting next task for host managed_node2 23826 1726867436.26107: ^ task is: TASK: Set the connection_failed flag 23826 1726867436.26109: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867436.26113: getting variables 23826 1726867436.26114: in VariableManager get_vars() 23826 1726867436.26155: Calling all_inventory to load vars for managed_node2 23826 1726867436.26158: Calling groups_inventory to load vars for managed_node2 23826 1726867436.26161: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867436.26170: Calling all_plugins_play to load vars for managed_node2 23826 1726867436.26174: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867436.26179: Calling groups_plugins_play to load vars for managed_node2 23826 1726867436.27926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867436.29872: done with get_vars() 23826 1726867436.29894: done getting variables 23826 1726867436.29956: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set the connection_failed flag] ****************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:64 Friday 20 September 2024 17:23:56 -0400 (0:00:00.074) 0:00:18.311 ****** 23826 1726867436.29986: entering _queue_task() for managed_node2/set_fact 23826 1726867436.30717: worker is 1 (out of 1 available) 23826 1726867436.30729: exiting _queue_task() for managed_node2/set_fact 23826 1726867436.30739: done queuing things up, now waiting for results queue to drain 23826 1726867436.30740: waiting for pending results... 
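
The task queued here, "Set the connection_failed flag" (tests_ipv6_disabled.yml:64), is skipped in the next entry because __network_connections_result.failed evaluates to false. A sketch of its likely shape: only the set_fact action and the when condition are confirmed by this log, while the fact name and value are an assumption based on the task name.

    - name: Set the connection_failed flag
      set_fact:
        connection_failed: true     # assumed fact name and value; not visible in this log
      when: __network_connections_result.failed
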
23826 1726867436.31546: running TaskExecutor() for managed_node2/TASK: Set the connection_failed flag 23826 1726867436.31639: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000005f 23826 1726867436.31653: variable 'ansible_search_path' from source: unknown 23826 1726867436.31770: calling self._execute() 23826 1726867436.31911: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867436.31917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867436.31935: variable 'omit' from source: magic vars 23826 1726867436.33085: variable 'ansible_distribution_major_version' from source: facts 23826 1726867436.33090: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867436.33096: variable '__network_connections_result' from source: set_fact 23826 1726867436.33098: Evaluated conditional (__network_connections_result.failed): False 23826 1726867436.33101: when evaluation is False, skipping this task 23826 1726867436.33146: _execute() done 23826 1726867436.33149: dumping result to json 23826 1726867436.33151: done dumping result, returning 23826 1726867436.33153: done running TaskExecutor() for managed_node2/TASK: Set the connection_failed flag [0affcac9-a3a5-a92d-a3ea-00000000005f] 23826 1726867436.33155: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000005f 23826 1726867436.33218: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000005f 23826 1726867436.33221: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_connections_result.failed", "skip_reason": "Conditional result was False" } 23826 1726867436.33271: no more pending results, returning what we have 23826 1726867436.33274: results queue empty 23826 1726867436.33276: checking for any_errors_fatal 23826 1726867436.33283: done checking for any_errors_fatal 23826 1726867436.33283: checking for max_fail_percentage 23826 1726867436.33285: done checking for max_fail_percentage 23826 1726867436.33286: checking to see if all hosts have failed and the running result is not ok 23826 1726867436.33287: done checking to see if all hosts have failed 23826 1726867436.33288: getting the remaining hosts for this loop 23826 1726867436.33289: done getting the remaining hosts for this loop 23826 1726867436.33293: getting the next task for host managed_node2 23826 1726867436.33298: done getting next task for host managed_node2 23826 1726867436.33300: ^ task is: TASK: meta (flush_handlers) 23826 1726867436.33302: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867436.33307: getting variables 23826 1726867436.33308: in VariableManager get_vars() 23826 1726867436.33337: Calling all_inventory to load vars for managed_node2 23826 1726867436.33340: Calling groups_inventory to load vars for managed_node2 23826 1726867436.33342: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867436.33351: Calling all_plugins_play to load vars for managed_node2 23826 1726867436.33354: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867436.33357: Calling groups_plugins_play to load vars for managed_node2 23826 1726867436.41574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867436.44816: done with get_vars() 23826 1726867436.44896: done getting variables 23826 1726867436.45091: in VariableManager get_vars() 23826 1726867436.45106: Calling all_inventory to load vars for managed_node2 23826 1726867436.45109: Calling groups_inventory to load vars for managed_node2 23826 1726867436.45111: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867436.45116: Calling all_plugins_play to load vars for managed_node2 23826 1726867436.45118: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867436.45121: Calling groups_plugins_play to load vars for managed_node2 23826 1726867436.47192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867436.49027: done with get_vars() 23826 1726867436.49053: done queuing things up, now waiting for results queue to drain 23826 1726867436.49055: results queue empty 23826 1726867436.49056: checking for any_errors_fatal 23826 1726867436.49058: done checking for any_errors_fatal 23826 1726867436.49059: checking for max_fail_percentage 23826 1726867436.49060: done checking for max_fail_percentage 23826 1726867436.49061: checking to see if all hosts have failed and the running result is not ok 23826 1726867436.49062: done checking to see if all hosts have failed 23826 1726867436.49062: getting the remaining hosts for this loop 23826 1726867436.49063: done getting the remaining hosts for this loop 23826 1726867436.49066: getting the next task for host managed_node2 23826 1726867436.49069: done getting next task for host managed_node2 23826 1726867436.49071: ^ task is: TASK: meta (flush_handlers) 23826 1726867436.49072: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867436.49075: getting variables 23826 1726867436.49076: in VariableManager get_vars() 23826 1726867436.49107: Calling all_inventory to load vars for managed_node2 23826 1726867436.49110: Calling groups_inventory to load vars for managed_node2 23826 1726867436.49112: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867436.49117: Calling all_plugins_play to load vars for managed_node2 23826 1726867436.49120: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867436.49123: Calling groups_plugins_play to load vars for managed_node2 23826 1726867436.50885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867436.52930: done with get_vars() 23826 1726867436.52951: done getting variables 23826 1726867436.53122: in VariableManager get_vars() 23826 1726867436.53135: Calling all_inventory to load vars for managed_node2 23826 1726867436.53138: Calling groups_inventory to load vars for managed_node2 23826 1726867436.53140: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867436.53145: Calling all_plugins_play to load vars for managed_node2 23826 1726867436.53147: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867436.53150: Calling groups_plugins_play to load vars for managed_node2 23826 1726867436.55438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867436.58909: done with get_vars() 23826 1726867436.58934: done queuing things up, now waiting for results queue to drain 23826 1726867436.58936: results queue empty 23826 1726867436.58937: checking for any_errors_fatal 23826 1726867436.58938: done checking for any_errors_fatal 23826 1726867436.58939: checking for max_fail_percentage 23826 1726867436.58940: done checking for max_fail_percentage 23826 1726867436.58941: checking to see if all hosts have failed and the running result is not ok 23826 1726867436.58941: done checking to see if all hosts have failed 23826 1726867436.58942: getting the remaining hosts for this loop 23826 1726867436.58943: done getting the remaining hosts for this loop 23826 1726867436.58946: getting the next task for host managed_node2 23826 1726867436.58954: done getting next task for host managed_node2 23826 1726867436.58955: ^ task is: None 23826 1726867436.58957: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867436.58958: done queuing things up, now waiting for results queue to drain 23826 1726867436.58959: results queue empty 23826 1726867436.58960: checking for any_errors_fatal 23826 1726867436.58960: done checking for any_errors_fatal 23826 1726867436.58961: checking for max_fail_percentage 23826 1726867436.58962: done checking for max_fail_percentage 23826 1726867436.58963: checking to see if all hosts have failed and the running result is not ok 23826 1726867436.58963: done checking to see if all hosts have failed 23826 1726867436.58965: getting the next task for host managed_node2 23826 1726867436.58967: done getting next task for host managed_node2 23826 1726867436.58968: ^ task is: None 23826 1726867436.58969: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867436.59246: in VariableManager get_vars() 23826 1726867436.59266: done with get_vars() 23826 1726867436.59271: in VariableManager get_vars() 23826 1726867436.59286: done with get_vars() 23826 1726867436.59291: variable 'omit' from source: magic vars 23826 1726867436.59500: variable 'profile' from source: play vars 23826 1726867436.59718: in VariableManager get_vars() 23826 1726867436.59731: done with get_vars() 23826 1726867436.59751: variable 'omit' from source: magic vars 23826 1726867436.59930: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 23826 1726867436.61306: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 23826 1726867436.61329: getting the remaining hosts for this loop 23826 1726867436.61331: done getting the remaining hosts for this loop 23826 1726867436.61333: getting the next task for host managed_node2 23826 1726867436.61493: done getting next task for host managed_node2 23826 1726867436.61495: ^ task is: TASK: Gathering Facts 23826 1726867436.61497: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867436.61499: getting variables 23826 1726867436.61500: in VariableManager get_vars() 23826 1726867436.61516: Calling all_inventory to load vars for managed_node2 23826 1726867436.61519: Calling groups_inventory to load vars for managed_node2 23826 1726867436.61520: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867436.61526: Calling all_plugins_play to load vars for managed_node2 23826 1726867436.61528: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867436.61531: Calling groups_plugins_play to load vars for managed_node2 23826 1726867436.63747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867436.66947: done with get_vars() 23826 1726867436.67086: done getting variables 23826 1726867436.67129: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 17:23:56 -0400 (0:00:00.371) 0:00:18.682 ****** 23826 1726867436.67154: entering _queue_task() for managed_node2/gather_facts 23826 1726867436.67897: worker is 1 (out of 1 available) 23826 1726867436.67908: exiting _queue_task() for managed_node2/gather_facts 23826 1726867436.67918: done queuing things up, now waiting for results queue to drain 23826 1726867436.67920: waiting for pending results... 
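
The run has now moved on to the play "Set down {{ profile }}", whose first step is the fact gathering at down_profile.yml:3. A minimal sketch of the play header under stated assumptions: the log confirms only the templated play name, that profile comes from play vars, and that facts are gathered first, so the hosts pattern and whatever the play does to bring the profile down are assumptions.

    - name: Set down {{ profile }}
      hosts: all                    # assumption; in this run it resolves to managed_node2
      gather_facts: true            # the "Gathering Facts" task at down_profile.yml:3
      # The rest of the play (presumably taking {{ profile }} down, for example via the
      # fedora.linux_system_roles.network role) is not visible in this excerpt.
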
23826 1726867436.68600: running TaskExecutor() for managed_node2/TASK: Gathering Facts 23826 1726867436.69073: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000454 23826 1726867436.69091: variable 'ansible_search_path' from source: unknown 23826 1726867436.69130: calling self._execute() 23826 1726867436.69455: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867436.69462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867436.69470: variable 'omit' from source: magic vars 23826 1726867436.70506: variable 'ansible_distribution_major_version' from source: facts 23826 1726867436.70528: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867436.70547: variable 'omit' from source: magic vars 23826 1726867436.70580: variable 'omit' from source: magic vars 23826 1726867436.70689: variable 'omit' from source: magic vars 23826 1726867436.70739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867436.70922: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867436.70946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867436.70998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867436.71183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867436.71193: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867436.71197: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867436.71200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867436.71432: Set connection var ansible_timeout to 10 23826 1726867436.71448: Set connection var ansible_shell_executable to /bin/sh 23826 1726867436.71464: Set connection var ansible_connection to ssh 23826 1726867436.71480: Set connection var ansible_pipelining to False 23826 1726867436.71488: Set connection var ansible_shell_type to sh 23826 1726867436.71497: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867436.71604: variable 'ansible_shell_executable' from source: unknown 23826 1726867436.71617: variable 'ansible_connection' from source: unknown 23826 1726867436.71638: variable 'ansible_module_compression' from source: unknown 23826 1726867436.71651: variable 'ansible_shell_type' from source: unknown 23826 1726867436.71851: variable 'ansible_shell_executable' from source: unknown 23826 1726867436.71854: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867436.71856: variable 'ansible_pipelining' from source: unknown 23826 1726867436.71859: variable 'ansible_timeout' from source: unknown 23826 1726867436.71861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867436.72067: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867436.72287: variable 'omit' from source: magic vars 23826 1726867436.72292: starting attempt loop 23826 1726867436.72295: running the 
handler 23826 1726867436.72297: variable 'ansible_facts' from source: unknown 23826 1726867436.72300: _low_level_execute_command(): starting 23826 1726867436.72302: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867436.73848: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867436.73896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867436.74259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867436.74262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867436.74429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867436.74598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867436.76334: stdout chunk (state=3): >>>/root <<< 23826 1726867436.76439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867436.76457: stderr chunk (state=3): >>><<< 23826 1726867436.76467: stdout chunk (state=3): >>><<< 23826 1726867436.76498: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867436.76521: _low_level_execute_command(): starting 23826 1726867436.76539: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867436.7650478-24791-211348123567790 `" && echo 
ansible-tmp-1726867436.7650478-24791-211348123567790="` echo /root/.ansible/tmp/ansible-tmp-1726867436.7650478-24791-211348123567790 `" ) && sleep 0' 23826 1726867436.77148: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867436.77199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 23826 1726867436.77224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867436.77306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867436.77322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867436.77345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867436.77421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867436.79496: stdout chunk (state=3): >>>ansible-tmp-1726867436.7650478-24791-211348123567790=/root/.ansible/tmp/ansible-tmp-1726867436.7650478-24791-211348123567790 <<< 23826 1726867436.79593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867436.79673: stderr chunk (state=3): >>><<< 23826 1726867436.79685: stdout chunk (state=3): >>><<< 23826 1726867436.79991: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867436.7650478-24791-211348123567790=/root/.ansible/tmp/ansible-tmp-1726867436.7650478-24791-211348123567790 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867436.79994: variable 'ansible_module_compression' from source: unknown 23826 1726867436.79997: ANSIBALLZ: using 
cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 23826 1726867436.79999: variable 'ansible_facts' from source: unknown 23826 1726867436.80493: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867436.7650478-24791-211348123567790/AnsiballZ_setup.py 23826 1726867436.80828: Sending initial data 23826 1726867436.80870: Sent initial data (154 bytes) 23826 1726867436.81473: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867436.81525: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867436.81542: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867436.81584: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867436.81656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867436.81674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867436.81700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867436.81793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867436.83812: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867436.83841: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867436.83903: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpdloyzahu /root/.ansible/tmp/ansible-tmp-1726867436.7650478-24791-211348123567790/AnsiballZ_setup.py <<< 23826 1726867436.83906: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867436.7650478-24791-211348123567790/AnsiballZ_setup.py" <<< 23826 1726867436.83964: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpdloyzahu" to remote "/root/.ansible/tmp/ansible-tmp-1726867436.7650478-24791-211348123567790/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867436.7650478-24791-211348123567790/AnsiballZ_setup.py" <<< 23826 1726867436.85514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867436.85560: stderr chunk (state=3): >>><<< 23826 1726867436.85569: stdout chunk (state=3): >>><<< 23826 1726867436.85594: done transferring module to remote 23826 1726867436.85611: _low_level_execute_command(): starting 23826 1726867436.85620: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867436.7650478-24791-211348123567790/ /root/.ansible/tmp/ansible-tmp-1726867436.7650478-24791-211348123567790/AnsiballZ_setup.py && sleep 0' 23826 1726867436.86245: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867436.86248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867436.86250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867436.86252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867436.86254: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 23826 1726867436.86257: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867436.86259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867436.86348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867436.86387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867436.88300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867436.88314: stderr chunk (state=3): >>><<< 23826 1726867436.88317: stdout chunk (state=3): >>><<< 23826 1726867436.88328: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867436.88331: _low_level_execute_command(): starting 23826 1726867436.88360: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867436.7650478-24791-211348123567790/AnsiballZ_setup.py && sleep 0' 23826 1726867436.89066: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867436.89085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867436.89102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867436.89146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867436.89222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867437.54544: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC 
Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.48779296875, "5m": 0.39013671875, "15m": 0.220703125}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "23", "second": "57", "epoch": "1726867437", "epoch_int": "1726867437", "date": "2024-09-20", "time": "17:23:57", "iso8601_micro": "2024-09-20T21:23:57.176862Z", "iso8601": "2024-09-20T21:23:57Z", "iso8601_basic": "20240920T172357176862", "iso8601_basic_short": "20240920T172357", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["ethtest0", "lo", "eth0", "peerethtest0"], "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "f6:b4:26:aa:e3:d8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f4b4:26ff:feaa:e3d8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off 
[fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "ba:29:a4:e0:1c:3b", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b829:a4ff:fee0:1c3b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::f4b4:26ff:feaa:e3d8", 
"fe80::8ff:d5ff:fec3:77ad", "fe80::b829:a4ff:fee0:1c3b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad", "fe80::b829:a4ff:fee0:1c3b", "fe80::f4b4:26ff:feaa:e3d8"]}, "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2944, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 587, "free": 2944}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 675, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794439168, "block_size": 4096, "block_total": 65519099, "block_available": 63914658, "block_used": 1604441, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, 
"ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 23826 1726867437.56572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 23826 1726867437.56588: stdout chunk (state=3): >>><<< 23826 1726867437.56773: stderr chunk (state=3): >>><<< 23826 1726867437.56780: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.48779296875, "5m": 0.39013671875, "15m": 0.220703125}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], 
"ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "23", "second": "57", "epoch": "1726867437", "epoch_int": "1726867437", "date": "2024-09-20", "time": "17:23:57", "iso8601_micro": "2024-09-20T21:23:57.176862Z", "iso8601": "2024-09-20T21:23:57Z", "iso8601_basic": "20240920T172357176862", "iso8601_basic_short": "20240920T172357", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["ethtest0", "lo", "eth0", "peerethtest0"], "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "f6:b4:26:aa:e3:d8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f4b4:26ff:feaa:e3d8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": 
"", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "ba:29:a4:e0:1c:3b", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b829:a4ff:fee0:1c3b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", 
"tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::f4b4:26ff:feaa:e3d8", "fe80::8ff:d5ff:fec3:77ad", "fe80::b829:a4ff:fee0:1c3b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad", "fe80::b829:a4ff:fee0:1c3b", "fe80::f4b4:26ff:feaa:e3d8"]}, "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2944, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 587, "free": 2944}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": 
["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 675, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794439168, "block_size": 4096, "block_total": 65519099, "block_available": 63914658, "block_used": 1604441, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
23826 1726867437.57782: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867436.7650478-24791-211348123567790/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867437.57789: _low_level_execute_command(): starting 23826 1726867437.57912: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867436.7650478-24791-211348123567790/ > /dev/null 2>&1 && sleep 0' 23826 1726867437.59025: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867437.59029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867437.59032: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 23826 1726867437.59034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867437.59223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867437.59281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867437.61291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867437.61295: stdout chunk (state=3): >>><<< 23826 1726867437.61297: stderr chunk (state=3): >>><<< 23826 1726867437.61313: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 
10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867437.61342: handler run complete 23826 1726867437.61516: variable 'ansible_facts' from source: unknown 23826 1726867437.61685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867437.62095: variable 'ansible_facts' from source: unknown 23826 1726867437.62206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867437.62460: attempt loop complete, returning result 23826 1726867437.62469: _execute() done 23826 1726867437.62518: dumping result to json 23826 1726867437.62528: done dumping result, returning 23826 1726867437.62541: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-a92d-a3ea-000000000454] 23826 1726867437.62550: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000454 ok: [managed_node2] 23826 1726867437.64010: no more pending results, returning what we have 23826 1726867437.64014: results queue empty 23826 1726867437.64015: checking for any_errors_fatal 23826 1726867437.64017: done checking for any_errors_fatal 23826 1726867437.64017: checking for max_fail_percentage 23826 1726867437.64019: done checking for max_fail_percentage 23826 1726867437.64019: checking to see if all hosts have failed and the running result is not ok 23826 1726867437.64020: done checking to see if all hosts have failed 23826 1726867437.64021: getting the remaining hosts for this loop 23826 1726867437.64022: done getting the remaining hosts for this loop 23826 1726867437.64025: getting the next task for host managed_node2 23826 1726867437.64037: done getting next task for host managed_node2 23826 1726867437.64039: ^ task is: TASK: meta (flush_handlers) 23826 1726867437.64041: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867437.64046: getting variables 23826 1726867437.64047: in VariableManager get_vars() 23826 1726867437.64076: Calling all_inventory to load vars for managed_node2 23826 1726867437.64081: Calling groups_inventory to load vars for managed_node2 23826 1726867437.64084: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867437.64090: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000454 23826 1726867437.64093: WORKER PROCESS EXITING 23826 1726867437.64103: Calling all_plugins_play to load vars for managed_node2 23826 1726867437.64106: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867437.64112: Calling groups_plugins_play to load vars for managed_node2 23826 1726867437.65774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867437.67525: done with get_vars() 23826 1726867437.67545: done getting variables 23826 1726867437.67625: in VariableManager get_vars() 23826 1726867437.67637: Calling all_inventory to load vars for managed_node2 23826 1726867437.67640: Calling groups_inventory to load vars for managed_node2 23826 1726867437.67641: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867437.67646: Calling all_plugins_play to load vars for managed_node2 23826 1726867437.67648: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867437.67651: Calling groups_plugins_play to load vars for managed_node2 23826 1726867437.69728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867437.73321: done with get_vars() 23826 1726867437.73469: done queuing things up, now waiting for results queue to drain 23826 1726867437.73472: results queue empty 23826 1726867437.73473: checking for any_errors_fatal 23826 1726867437.73480: done checking for any_errors_fatal 23826 1726867437.73481: checking for max_fail_percentage 23826 1726867437.73482: done checking for max_fail_percentage 23826 1726867437.73482: checking to see if all hosts have failed and the running result is not ok 23826 1726867437.73488: done checking to see if all hosts have failed 23826 1726867437.73489: getting the remaining hosts for this loop 23826 1726867437.73490: done getting the remaining hosts for this loop 23826 1726867437.73493: getting the next task for host managed_node2 23826 1726867437.73498: done getting next task for host managed_node2 23826 1726867437.73501: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 23826 1726867437.73503: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867437.73513: getting variables 23826 1726867437.73515: in VariableManager get_vars() 23826 1726867437.73531: Calling all_inventory to load vars for managed_node2 23826 1726867437.73533: Calling groups_inventory to load vars for managed_node2 23826 1726867437.73535: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867437.73541: Calling all_plugins_play to load vars for managed_node2 23826 1726867437.73662: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867437.73668: Calling groups_plugins_play to load vars for managed_node2 23826 1726867437.76030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867437.79323: done with get_vars() 23826 1726867437.79345: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:23:57 -0400 (0:00:01.123) 0:00:19.806 ****** 23826 1726867437.79545: entering _queue_task() for managed_node2/include_tasks 23826 1726867437.80282: worker is 1 (out of 1 available) 23826 1726867437.80295: exiting _queue_task() for managed_node2/include_tasks 23826 1726867437.80307: done queuing things up, now waiting for results queue to drain 23826 1726867437.80308: waiting for pending results... 23826 1726867437.80680: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 23826 1726867437.80992: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000067 23826 1726867437.81029: variable 'ansible_search_path' from source: unknown 23826 1726867437.81035: variable 'ansible_search_path' from source: unknown 23826 1726867437.81050: calling self._execute() 23826 1726867437.81329: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867437.81334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867437.81336: variable 'omit' from source: magic vars 23826 1726867437.82096: variable 'ansible_distribution_major_version' from source: facts 23826 1726867437.82107: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867437.82222: variable 'connection_failed' from source: set_fact 23826 1726867437.82343: Evaluated conditional (not connection_failed): True 23826 1726867437.82523: variable 'ansible_distribution_major_version' from source: facts 23826 1726867437.82526: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867437.82739: variable 'connection_failed' from source: set_fact 23826 1726867437.82743: Evaluated conditional (not connection_failed): True 23826 1726867437.82745: _execute() done 23826 1726867437.82748: dumping result to json 23826 1726867437.82753: done dumping result, returning 23826 1726867437.82760: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-a92d-a3ea-000000000067] 23826 1726867437.82886: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000067 23826 1726867437.83024: no more pending results, returning what we have 23826 1726867437.83030: in VariableManager get_vars() 23826 1726867437.83288: Calling all_inventory to load vars for managed_node2 23826 1726867437.83291: Calling groups_inventory to load vars for managed_node2 23826 1726867437.83293: Calling all_plugins_inventory to load vars for 
managed_node2 23826 1726867437.83302: Calling all_plugins_play to load vars for managed_node2 23826 1726867437.83305: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867437.83307: Calling groups_plugins_play to load vars for managed_node2 23826 1726867437.84011: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000067 23826 1726867437.84014: WORKER PROCESS EXITING 23826 1726867437.86129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867437.89455: done with get_vars() 23826 1726867437.89473: variable 'ansible_search_path' from source: unknown 23826 1726867437.89474: variable 'ansible_search_path' from source: unknown 23826 1726867437.89503: we have included files to process 23826 1726867437.89504: generating all_blocks data 23826 1726867437.89505: done generating all_blocks data 23826 1726867437.89506: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 23826 1726867437.89507: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 23826 1726867437.89509: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 23826 1726867437.90847: done processing included file 23826 1726867437.90850: iterating over new_blocks loaded from include file 23826 1726867437.90852: in VariableManager get_vars() 23826 1726867437.90874: done with get_vars() 23826 1726867437.90876: filtering new block on tags 23826 1726867437.90894: done filtering new block on tags 23826 1726867437.90897: in VariableManager get_vars() 23826 1726867437.90916: done with get_vars() 23826 1726867437.90918: filtering new block on tags 23826 1726867437.90936: done filtering new block on tags 23826 1726867437.90939: in VariableManager get_vars() 23826 1726867437.91078: done with get_vars() 23826 1726867437.91081: filtering new block on tags 23826 1726867437.91097: done filtering new block on tags 23826 1726867437.91099: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 23826 1726867437.91105: extending task lists for all hosts with included blocks 23826 1726867437.92001: done extending task lists 23826 1726867437.92002: done processing included files 23826 1726867437.92003: results queue empty 23826 1726867437.92004: checking for any_errors_fatal 23826 1726867437.92005: done checking for any_errors_fatal 23826 1726867437.92006: checking for max_fail_percentage 23826 1726867437.92007: done checking for max_fail_percentage 23826 1726867437.92007: checking to see if all hosts have failed and the running result is not ok 23826 1726867437.92008: done checking to see if all hosts have failed 23826 1726867437.92009: getting the remaining hosts for this loop 23826 1726867437.92010: done getting the remaining hosts for this loop 23826 1726867437.92014: getting the next task for host managed_node2 23826 1726867437.92018: done getting next task for host managed_node2 23826 1726867437.92020: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 23826 1726867437.92022: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867437.92036: getting variables 23826 1726867437.92037: in VariableManager get_vars() 23826 1726867437.92057: Calling all_inventory to load vars for managed_node2 23826 1726867437.92059: Calling groups_inventory to load vars for managed_node2 23826 1726867437.92061: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867437.92067: Calling all_plugins_play to load vars for managed_node2 23826 1726867437.92069: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867437.92072: Calling groups_plugins_play to load vars for managed_node2 23826 1726867437.94109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867437.95861: done with get_vars() 23826 1726867437.95881: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:23:57 -0400 (0:00:00.164) 0:00:19.970 ****** 23826 1726867437.95952: entering _queue_task() for managed_node2/setup 23826 1726867437.96347: worker is 1 (out of 1 available) 23826 1726867437.96358: exiting _queue_task() for managed_node2/setup 23826 1726867437.96368: done queuing things up, now waiting for results queue to drain 23826 1726867437.96369: waiting for pending results... 
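For reference, the include step logged just above (TASK "Ensure ansible_facts used by role" at roles/network/tasks/main.yml:4, which loads tasks/set_facts.yml, filters its blocks on tags, and extends the host's task list) corresponds to an include task of roughly this shape. This is a minimal sketch only; the exact task body and any conditionals attached to it in the shipped fedora.linux_system_roles.network role are not visible in this log:

    - name: Ensure ansible_facts used by role
      include_tasks: set_facts.yml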
23826 1726867437.96646: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 23826 1726867437.96803: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000495 23826 1726867437.96833: variable 'ansible_search_path' from source: unknown 23826 1726867437.96843: variable 'ansible_search_path' from source: unknown 23826 1726867437.96893: calling self._execute() 23826 1726867437.96997: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867437.97011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867437.97037: variable 'omit' from source: magic vars 23826 1726867437.98185: variable 'ansible_distribution_major_version' from source: facts 23826 1726867437.98188: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867437.98384: variable 'connection_failed' from source: set_fact 23826 1726867437.98387: Evaluated conditional (not connection_failed): True 23826 1726867437.98520: variable 'ansible_distribution_major_version' from source: facts 23826 1726867437.98546: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867437.98733: variable 'connection_failed' from source: set_fact 23826 1726867437.98737: Evaluated conditional (not connection_failed): True 23826 1726867437.98959: variable 'ansible_distribution_major_version' from source: facts 23826 1726867437.98963: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867437.99102: variable 'connection_failed' from source: set_fact 23826 1726867437.99106: Evaluated conditional (not connection_failed): True 23826 1726867437.99239: variable 'ansible_distribution_major_version' from source: facts 23826 1726867437.99253: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867437.99355: variable 'connection_failed' from source: set_fact 23826 1726867437.99361: Evaluated conditional (not connection_failed): True 23826 1726867437.99595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867438.03613: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867438.03986: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867438.03989: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867438.03992: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867438.03994: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867438.03997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867438.03999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867438.04001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867438.04003: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867438.04005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867438.04280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867438.04305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867438.04328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867438.04376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867438.04394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867438.04562: variable '__network_required_facts' from source: role '' defaults 23826 1726867438.04585: variable 'ansible_facts' from source: unknown 23826 1726867438.05891: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 23826 1726867438.05899: when evaluation is False, skipping this task 23826 1726867438.05910: _execute() done 23826 1726867438.05917: dumping result to json 23826 1726867438.05925: done dumping result, returning 23826 1726867438.05936: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-a92d-a3ea-000000000495] 23826 1726867438.05945: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000495 23826 1726867438.06064: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000495 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 23826 1726867438.06113: no more pending results, returning what we have 23826 1726867438.06116: results queue empty 23826 1726867438.06117: checking for any_errors_fatal 23826 1726867438.06118: done checking for any_errors_fatal 23826 1726867438.06119: checking for max_fail_percentage 23826 1726867438.06121: done checking for max_fail_percentage 23826 1726867438.06121: checking to see if all hosts have failed and the running result is not ok 23826 1726867438.06122: done checking to see if all hosts have failed 23826 1726867438.06123: getting the remaining hosts for this loop 23826 1726867438.06124: done getting the remaining hosts for this loop 23826 1726867438.06128: getting the next task for host managed_node2 23826 1726867438.06136: done getting next task for host managed_node2 23826 1726867438.06139: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 23826 1726867438.06142: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867438.06160: getting variables 23826 1726867438.06161: in VariableManager get_vars() 23826 1726867438.06209: Calling all_inventory to load vars for managed_node2 23826 1726867438.06212: Calling groups_inventory to load vars for managed_node2 23826 1726867438.06220: WORKER PROCESS EXITING 23826 1726867438.06321: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867438.06396: Calling all_plugins_play to load vars for managed_node2 23826 1726867438.06399: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867438.06402: Calling groups_plugins_play to load vars for managed_node2 23826 1726867438.07875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867438.09523: done with get_vars() 23826 1726867438.09550: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:23:58 -0400 (0:00:00.136) 0:00:20.107 ****** 23826 1726867438.09657: entering _queue_task() for managed_node2/stat 23826 1726867438.09963: worker is 1 (out of 1 available) 23826 1726867438.10089: exiting _queue_task() for managed_node2/stat 23826 1726867438.10101: done queuing things up, now waiting for results queue to drain 23826 1726867438.10102: waiting for pending results... 
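The skip recorded above comes from the guarded fact-gathering task at set_facts.yml:3: the conditional __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluated to False because every fact the role needs was already gathered, so the setup action never ran. A minimal sketch of such a task follows; the action (setup) and the when expression are taken from the log, while the gather_subset value shown here is an assumption for illustration:

    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min   # assumed value; the real task may request a different subset
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0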
23826 1726867438.10295: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 23826 1726867438.10583: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000497 23826 1726867438.10587: variable 'ansible_search_path' from source: unknown 23826 1726867438.10591: variable 'ansible_search_path' from source: unknown 23826 1726867438.10594: calling self._execute() 23826 1726867438.10596: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867438.10598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867438.10601: variable 'omit' from source: magic vars 23826 1726867438.10989: variable 'ansible_distribution_major_version' from source: facts 23826 1726867438.11001: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867438.11112: variable 'connection_failed' from source: set_fact 23826 1726867438.11116: Evaluated conditional (not connection_failed): True 23826 1726867438.11228: variable 'ansible_distribution_major_version' from source: facts 23826 1726867438.11240: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867438.11355: variable 'connection_failed' from source: set_fact 23826 1726867438.11366: Evaluated conditional (not connection_failed): True 23826 1726867438.11494: variable 'ansible_distribution_major_version' from source: facts 23826 1726867438.11514: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867438.11622: variable 'connection_failed' from source: set_fact 23826 1726867438.11640: Evaluated conditional (not connection_failed): True 23826 1726867438.11783: variable 'ansible_distribution_major_version' from source: facts 23826 1726867438.11787: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867438.11885: variable 'connection_failed' from source: set_fact 23826 1726867438.11953: Evaluated conditional (not connection_failed): True 23826 1726867438.12099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867438.12404: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867438.12455: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867438.12508: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867438.12545: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867438.12716: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867438.12732: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867438.12764: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867438.12825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867438.12899: variable 
'__network_is_ostree' from source: set_fact 23826 1726867438.12912: Evaluated conditional (not __network_is_ostree is defined): False 23826 1726867438.12933: when evaluation is False, skipping this task 23826 1726867438.12936: _execute() done 23826 1726867438.12983: dumping result to json 23826 1726867438.12987: done dumping result, returning 23826 1726867438.12989: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-a92d-a3ea-000000000497] 23826 1726867438.12991: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000497 23826 1726867438.13225: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000497 23826 1726867438.13229: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 23826 1726867438.13292: no more pending results, returning what we have 23826 1726867438.13297: results queue empty 23826 1726867438.13298: checking for any_errors_fatal 23826 1726867438.13304: done checking for any_errors_fatal 23826 1726867438.13305: checking for max_fail_percentage 23826 1726867438.13307: done checking for max_fail_percentage 23826 1726867438.13307: checking to see if all hosts have failed and the running result is not ok 23826 1726867438.13308: done checking to see if all hosts have failed 23826 1726867438.13309: getting the remaining hosts for this loop 23826 1726867438.13311: done getting the remaining hosts for this loop 23826 1726867438.13315: getting the next task for host managed_node2 23826 1726867438.13321: done getting next task for host managed_node2 23826 1726867438.13326: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 23826 1726867438.13329: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867438.13391: getting variables 23826 1726867438.13394: in VariableManager get_vars() 23826 1726867438.13433: Calling all_inventory to load vars for managed_node2 23826 1726867438.13437: Calling groups_inventory to load vars for managed_node2 23826 1726867438.13439: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867438.13564: Calling all_plugins_play to load vars for managed_node2 23826 1726867438.13568: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867438.13571: Calling groups_plugins_play to load vars for managed_node2 23826 1726867438.14984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867438.16671: done with get_vars() 23826 1726867438.16695: done getting variables 23826 1726867438.16764: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:23:58 -0400 (0:00:00.071) 0:00:20.179 ****** 23826 1726867438.16802: entering _queue_task() for managed_node2/set_fact 23826 1726867438.17152: worker is 1 (out of 1 available) 23826 1726867438.17165: exiting _queue_task() for managed_node2/set_fact 23826 1726867438.17176: done queuing things up, now waiting for results queue to drain 23826 1726867438.17182: waiting for pending results... 23826 1726867438.17482: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 23826 1726867438.17717: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000498 23826 1726867438.17720: variable 'ansible_search_path' from source: unknown 23826 1726867438.17723: variable 'ansible_search_path' from source: unknown 23826 1726867438.17725: calling self._execute() 23826 1726867438.17840: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867438.17853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867438.17869: variable 'omit' from source: magic vars 23826 1726867438.18271: variable 'ansible_distribution_major_version' from source: facts 23826 1726867438.18291: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867438.18414: variable 'connection_failed' from source: set_fact 23826 1726867438.18426: Evaluated conditional (not connection_failed): True 23826 1726867438.18544: variable 'ansible_distribution_major_version' from source: facts 23826 1726867438.18555: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867438.18662: variable 'connection_failed' from source: set_fact 23826 1726867438.18673: Evaluated conditional (not connection_failed): True 23826 1726867438.18793: variable 'ansible_distribution_major_version' from source: facts 23826 1726867438.18883: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867438.18922: variable 'connection_failed' from source: set_fact 23826 1726867438.18933: Evaluated conditional (not connection_failed): True 23826 1726867438.19085: variable 'ansible_distribution_major_version' from source: facts 23826 
1726867438.19108: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867438.19243: variable 'connection_failed' from source: set_fact 23826 1726867438.19254: Evaluated conditional (not connection_failed): True 23826 1726867438.19444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867438.19727: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867438.19865: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867438.19868: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867438.19870: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867438.19989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867438.20024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867438.20054: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867438.20086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867438.20175: variable '__network_is_ostree' from source: set_fact 23826 1726867438.20192: Evaluated conditional (not __network_is_ostree is defined): False 23826 1726867438.20199: when evaluation is False, skipping this task 23826 1726867438.20215: _execute() done 23826 1726867438.20218: dumping result to json 23826 1726867438.20299: done dumping result, returning 23826 1726867438.20302: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-a92d-a3ea-000000000498] 23826 1726867438.20305: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000498 23826 1726867438.20379: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000498 23826 1726867438.20383: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 23826 1726867438.20457: no more pending results, returning what we have 23826 1726867438.20461: results queue empty 23826 1726867438.20462: checking for any_errors_fatal 23826 1726867438.20470: done checking for any_errors_fatal 23826 1726867438.20471: checking for max_fail_percentage 23826 1726867438.20473: done checking for max_fail_percentage 23826 1726867438.20474: checking to see if all hosts have failed and the running result is not ok 23826 1726867438.20475: done checking to see if all hosts have failed 23826 1726867438.20476: getting the remaining hosts for this loop 23826 1726867438.20479: done getting the remaining hosts for this loop 23826 1726867438.20483: getting the next task for host managed_node2 23826 1726867438.20492: done getting next task for host managed_node2 23826 1726867438.20496: ^ task is: TASK: fedora.linux_system_roles.network 
: Check which services are running 23826 1726867438.20498: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867438.20514: getting variables 23826 1726867438.20516: in VariableManager get_vars() 23826 1726867438.20559: Calling all_inventory to load vars for managed_node2 23826 1726867438.20562: Calling groups_inventory to load vars for managed_node2 23826 1726867438.20565: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867438.20576: Calling all_plugins_play to load vars for managed_node2 23826 1726867438.20755: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867438.20759: Calling groups_plugins_play to load vars for managed_node2 23826 1726867438.22334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867438.26147: done with get_vars() 23826 1726867438.26183: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:23:58 -0400 (0:00:00.095) 0:00:20.275 ****** 23826 1726867438.26368: entering _queue_task() for managed_node2/service_facts 23826 1726867438.27315: worker is 1 (out of 1 available) 23826 1726867438.27326: exiting _queue_task() for managed_node2/service_facts 23826 1726867438.27335: done queuing things up, now waiting for results queue to drain 23826 1726867438.27337: waiting for pending results... 
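The task queued above ("Check which services are running", set_facts.yml:21) runs the service_facts action; its result, visible further down as the large ansible_facts.services dictionary, maps each unit name to its state, status, and source. A minimal sketch of the task, plus a hypothetical consumer of the gathered facts (the debug task below is illustrative only, not part of the role):

    - name: Check which services are running
      service_facts:

    # Hypothetical example of consuming the gathered facts:
    - name: Report NetworkManager state
      debug:
        msg: "{{ ansible_facts.services['NetworkManager.service'].state }}"
      when: "'NetworkManager.service' in ansible_facts.services"

Because ansible_pipelining is False in this run (see the connection variables set below), the module is shipped to a per-task remote temp directory over SFTP, made executable, and run with the remote python3.12; enabling pipelining would avoid those extra round trips.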
23826 1726867438.27998: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 23826 1726867438.28004: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000049a 23826 1726867438.28007: variable 'ansible_search_path' from source: unknown 23826 1726867438.28010: variable 'ansible_search_path' from source: unknown 23826 1726867438.28012: calling self._execute() 23826 1726867438.28147: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867438.28203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867438.28207: variable 'omit' from source: magic vars 23826 1726867438.28943: variable 'ansible_distribution_major_version' from source: facts 23826 1726867438.28955: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867438.29552: variable 'connection_failed' from source: set_fact 23826 1726867438.29556: Evaluated conditional (not connection_failed): True 23826 1726867438.29978: variable 'ansible_distribution_major_version' from source: facts 23826 1726867438.29982: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867438.29984: variable 'connection_failed' from source: set_fact 23826 1726867438.29986: Evaluated conditional (not connection_failed): True 23826 1726867438.30400: variable 'ansible_distribution_major_version' from source: facts 23826 1726867438.30405: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867438.30852: variable 'connection_failed' from source: set_fact 23826 1726867438.30856: Evaluated conditional (not connection_failed): True 23826 1726867438.30859: variable 'ansible_distribution_major_version' from source: facts 23826 1726867438.30861: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867438.31118: variable 'connection_failed' from source: set_fact 23826 1726867438.31122: Evaluated conditional (not connection_failed): True 23826 1726867438.31128: variable 'omit' from source: magic vars 23826 1726867438.31383: variable 'omit' from source: magic vars 23826 1726867438.31389: variable 'omit' from source: magic vars 23826 1726867438.31392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867438.31566: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867438.31589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867438.31606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867438.31621: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867438.31765: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867438.31769: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867438.31771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867438.32083: Set connection var ansible_timeout to 10 23826 1726867438.32086: Set connection var ansible_shell_executable to /bin/sh 23826 1726867438.32089: Set connection var ansible_connection to ssh 23826 1726867438.32092: Set connection var ansible_pipelining to False 23826 1726867438.32094: Set connection var ansible_shell_type to sh 23826 
1726867438.32096: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867438.32098: variable 'ansible_shell_executable' from source: unknown 23826 1726867438.32101: variable 'ansible_connection' from source: unknown 23826 1726867438.32103: variable 'ansible_module_compression' from source: unknown 23826 1726867438.32105: variable 'ansible_shell_type' from source: unknown 23826 1726867438.32107: variable 'ansible_shell_executable' from source: unknown 23826 1726867438.32109: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867438.32111: variable 'ansible_pipelining' from source: unknown 23826 1726867438.32112: variable 'ansible_timeout' from source: unknown 23826 1726867438.32115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867438.32483: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 23826 1726867438.32496: variable 'omit' from source: magic vars 23826 1726867438.32499: starting attempt loop 23826 1726867438.32501: running the handler 23826 1726867438.32603: _low_level_execute_command(): starting 23826 1726867438.32639: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867438.33919: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867438.34053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867438.34191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867438.34215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867438.34228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867438.34298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867438.35996: stdout chunk (state=3): >>>/root <<< 23826 1726867438.36095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867438.36400: stderr chunk (state=3): >>><<< 23826 1726867438.36403: stdout chunk (state=3): >>><<< 23826 1726867438.36425: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867438.36440: _low_level_execute_command(): starting 23826 1726867438.36447: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867438.3642507-24869-231818920770362 `" && echo ansible-tmp-1726867438.3642507-24869-231818920770362="` echo /root/.ansible/tmp/ansible-tmp-1726867438.3642507-24869-231818920770362 `" ) && sleep 0' 23826 1726867438.37493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867438.37665: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867438.37697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867438.37702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867438.37708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867438.37720: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867438.37782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867438.37786: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867438.37788: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 23826 1726867438.37790: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 23826 1726867438.37797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867438.37803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867438.37806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867438.37808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867438.37875: stderr chunk (state=3): >>>debug2: match found <<< 23826 1726867438.37884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867438.38054: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867438.38096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867438.40153: stdout chunk (state=3): 
>>>ansible-tmp-1726867438.3642507-24869-231818920770362=/root/.ansible/tmp/ansible-tmp-1726867438.3642507-24869-231818920770362 <<< 23826 1726867438.40246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867438.40249: stdout chunk (state=3): >>><<< 23826 1726867438.40251: stderr chunk (state=3): >>><<< 23826 1726867438.40254: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867438.3642507-24869-231818920770362=/root/.ansible/tmp/ansible-tmp-1726867438.3642507-24869-231818920770362 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867438.40425: variable 'ansible_module_compression' from source: unknown 23826 1726867438.40428: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 23826 1726867438.40430: variable 'ansible_facts' from source: unknown 23826 1726867438.40597: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867438.3642507-24869-231818920770362/AnsiballZ_service_facts.py 23826 1726867438.40837: Sending initial data 23826 1726867438.40841: Sent initial data (162 bytes) 23826 1726867438.41396: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867438.41515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867438.41519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867438.41611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 23826 1726867438.41684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867438.43325: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867438.43363: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 23826 1726867438.43406: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpml8x_m1c /root/.ansible/tmp/ansible-tmp-1726867438.3642507-24869-231818920770362/AnsiballZ_service_facts.py <<< 23826 1726867438.43409: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867438.3642507-24869-231818920770362/AnsiballZ_service_facts.py" <<< 23826 1726867438.43543: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpml8x_m1c" to remote "/root/.ansible/tmp/ansible-tmp-1726867438.3642507-24869-231818920770362/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867438.3642507-24869-231818920770362/AnsiballZ_service_facts.py" <<< 23826 1726867438.45124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867438.45193: stderr chunk (state=3): >>><<< 23826 1726867438.45196: stdout chunk (state=3): >>><<< 23826 1726867438.45198: done transferring module to remote 23826 1726867438.45223: _low_level_execute_command(): starting 23826 1726867438.45227: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867438.3642507-24869-231818920770362/ /root/.ansible/tmp/ansible-tmp-1726867438.3642507-24869-231818920770362/AnsiballZ_service_facts.py && sleep 0' 23826 1726867438.46283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867438.46337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867438.46794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867438.46842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867438.48742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867438.48761: stderr chunk (state=3): >>><<< 23826 1726867438.48770: stdout chunk (state=3): >>><<< 23826 1726867438.48794: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867438.48804: _low_level_execute_command(): starting 23826 1726867438.48815: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867438.3642507-24869-231818920770362/AnsiballZ_service_facts.py && sleep 0' 23826 1726867438.49991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867438.50172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867438.50261: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867438.50281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867438.50296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867438.50385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867440.10916: stdout chunk (state=3): >>> {"ansible_facts": 
{"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": 
"initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 23826 1726867440.11001: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": 
"systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": 
"chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 23826 1726867440.12796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 23826 1726867440.12859: stdout chunk (state=3): >>><<< 23826 1726867440.12863: stderr chunk (state=3): >>><<< 23826 1726867440.13285: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": 
"modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": 
{"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": 
"systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": 
"disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 23826 1726867440.15660: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867438.3642507-24869-231818920770362/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867440.15709: _low_level_execute_command(): starting 23826 1726867440.15723: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867438.3642507-24869-231818920770362/ > /dev/null 2>&1 && sleep 0' 23826 1726867440.16901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867440.16905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867440.16907: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 23826 1726867440.16909: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867440.16912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867440.17083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867440.17184: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867440.17261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867440.19185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867440.19203: stdout chunk (state=3): >>><<< 23826 1726867440.19216: stderr chunk (state=3): >>><<< 23826 1726867440.19235: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867440.19245: handler run complete 23826 1726867440.19453: variable 'ansible_facts' from source: unknown 23826 1726867440.19607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867440.20125: variable 'ansible_facts' from source: unknown 23826 1726867440.20284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867440.20487: attempt loop complete, returning result 23826 1726867440.20506: _execute() done 23826 1726867440.20582: dumping result to json 23826 1726867440.20586: done dumping result, returning 23826 1726867440.20592: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-a92d-a3ea-00000000049a] 23826 1726867440.20601: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000049a 23826 1726867440.22041: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000049a 23826 1726867440.22044: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 23826 1726867440.22159: no more pending results, returning what we have 23826 1726867440.22161: results queue empty 23826 1726867440.22162: checking for any_errors_fatal 23826 1726867440.22165: done checking for any_errors_fatal 23826 1726867440.22166: checking for max_fail_percentage 23826 1726867440.22167: done checking for max_fail_percentage 23826 1726867440.22168: checking to see if all hosts have failed and the running result is not ok 23826 1726867440.22169: done checking to see if all hosts have failed 23826 1726867440.22169: getting the remaining hosts for this loop 23826 1726867440.22170: done getting the remaining hosts for this loop 23826 1726867440.22174: getting the next task for 
host managed_node2 23826 1726867440.22180: done getting next task for host managed_node2 23826 1726867440.22183: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 23826 1726867440.22186: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867440.22195: getting variables 23826 1726867440.22197: in VariableManager get_vars() 23826 1726867440.22226: Calling all_inventory to load vars for managed_node2 23826 1726867440.22229: Calling groups_inventory to load vars for managed_node2 23826 1726867440.22231: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867440.22239: Calling all_plugins_play to load vars for managed_node2 23826 1726867440.22241: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867440.22244: Calling groups_plugins_play to load vars for managed_node2 23826 1726867440.23404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867440.24922: done with get_vars() 23826 1726867440.24950: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:24:00 -0400 (0:00:01.986) 0:00:22.261 ****** 23826 1726867440.25042: entering _queue_task() for managed_node2/package_facts 23826 1726867440.25423: worker is 1 (out of 1 available) 23826 1726867440.25438: exiting _queue_task() for managed_node2/package_facts 23826 1726867440.25453: done queuing things up, now waiting for results queue to drain 23826 1726867440.25455: waiting for pending results... 
23826 1726867440.25897: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 23826 1726867440.25922: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000049b 23826 1726867440.25946: variable 'ansible_search_path' from source: unknown 23826 1726867440.25955: variable 'ansible_search_path' from source: unknown 23826 1726867440.26001: calling self._execute() 23826 1726867440.26114: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867440.26132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867440.26149: variable 'omit' from source: magic vars 23826 1726867440.26542: variable 'ansible_distribution_major_version' from source: facts 23826 1726867440.26565: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867440.26686: variable 'connection_failed' from source: set_fact 23826 1726867440.26781: Evaluated conditional (not connection_failed): True 23826 1726867440.26816: variable 'ansible_distribution_major_version' from source: facts 23826 1726867440.26828: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867440.26932: variable 'connection_failed' from source: set_fact 23826 1726867440.26944: Evaluated conditional (not connection_failed): True 23826 1726867440.27059: variable 'ansible_distribution_major_version' from source: facts 23826 1726867440.27070: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867440.27176: variable 'connection_failed' from source: set_fact 23826 1726867440.27191: Evaluated conditional (not connection_failed): True 23826 1726867440.27310: variable 'ansible_distribution_major_version' from source: facts 23826 1726867440.27328: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867440.27431: variable 'connection_failed' from source: set_fact 23826 1726867440.27442: Evaluated conditional (not connection_failed): True 23826 1726867440.27454: variable 'omit' from source: magic vars 23826 1726867440.27536: variable 'omit' from source: magic vars 23826 1726867440.27562: variable 'omit' from source: magic vars 23826 1726867440.27606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867440.27753: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867440.27756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867440.27758: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867440.27760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867440.27762: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867440.27764: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867440.27766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867440.27845: Set connection var ansible_timeout to 10 23826 1726867440.27862: Set connection var ansible_shell_executable to /bin/sh 23826 1726867440.27868: Set connection var ansible_connection to ssh 23826 1726867440.27882: Set connection var ansible_pipelining to False 23826 1726867440.27887: Set connection var ansible_shell_type to sh 23826 
1726867440.27895: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867440.27925: variable 'ansible_shell_executable' from source: unknown 23826 1726867440.27931: variable 'ansible_connection' from source: unknown 23826 1726867440.27937: variable 'ansible_module_compression' from source: unknown 23826 1726867440.27942: variable 'ansible_shell_type' from source: unknown 23826 1726867440.27947: variable 'ansible_shell_executable' from source: unknown 23826 1726867440.27952: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867440.27957: variable 'ansible_pipelining' from source: unknown 23826 1726867440.27962: variable 'ansible_timeout' from source: unknown 23826 1726867440.27974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867440.28163: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 23826 1726867440.28180: variable 'omit' from source: magic vars 23826 1726867440.28193: starting attempt loop 23826 1726867440.28199: running the handler 23826 1726867440.28220: _low_level_execute_command(): starting 23826 1726867440.28234: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867440.29075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867440.29101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867440.29122: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867440.29146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867440.29229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867440.30941: stdout chunk (state=3): >>>/root <<< 23826 1726867440.31106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867440.31110: stdout chunk (state=3): >>><<< 23826 1726867440.31113: stderr chunk (state=3): >>><<< 23826 1726867440.31186: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867440.31198: _low_level_execute_command(): starting 23826 1726867440.31201: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867440.3113563-24985-142203697779906 `" && echo ansible-tmp-1726867440.3113563-24985-142203697779906="` echo /root/.ansible/tmp/ansible-tmp-1726867440.3113563-24985-142203697779906 `" ) && sleep 0' 23826 1726867440.31825: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867440.31841: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867440.31859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867440.31881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867440.31901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867440.31918: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867440.31934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867440.31998: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867440.32043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867440.32060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867440.32084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867440.32156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867440.34126: stdout chunk (state=3): >>>ansible-tmp-1726867440.3113563-24985-142203697779906=/root/.ansible/tmp/ansible-tmp-1726867440.3113563-24985-142203697779906 <<< 23826 1726867440.34255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867440.34295: stderr chunk (state=3): >>><<< 23826 1726867440.34316: stdout chunk (state=3): >>><<< 23826 1726867440.34339: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726867440.3113563-24985-142203697779906=/root/.ansible/tmp/ansible-tmp-1726867440.3113563-24985-142203697779906 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867440.34410: variable 'ansible_module_compression' from source: unknown 23826 1726867440.34475: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 23826 1726867440.34553: variable 'ansible_facts' from source: unknown 23826 1726867440.34762: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867440.3113563-24985-142203697779906/AnsiballZ_package_facts.py 23826 1726867440.34930: Sending initial data 23826 1726867440.35057: Sent initial data (162 bytes) 23826 1726867440.35626: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867440.35706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867440.35760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867440.35776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867440.35812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867440.35892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867440.37504: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 
debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867440.37623: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 23826 1726867440.37628: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmp32iiesx_ /root/.ansible/tmp/ansible-tmp-1726867440.3113563-24985-142203697779906/AnsiballZ_package_facts.py <<< 23826 1726867440.37631: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867440.3113563-24985-142203697779906/AnsiballZ_package_facts.py" <<< 23826 1726867440.37635: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmp32iiesx_" to remote "/root/.ansible/tmp/ansible-tmp-1726867440.3113563-24985-142203697779906/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867440.3113563-24985-142203697779906/AnsiballZ_package_facts.py" <<< 23826 1726867440.39386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867440.39406: stderr chunk (state=3): >>><<< 23826 1726867440.39529: stdout chunk (state=3): >>><<< 23826 1726867440.39532: done transferring module to remote 23826 1726867440.39535: _low_level_execute_command(): starting 23826 1726867440.39542: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867440.3113563-24985-142203697779906/ /root/.ansible/tmp/ansible-tmp-1726867440.3113563-24985-142203697779906/AnsiballZ_package_facts.py && sleep 0' 23826 1726867440.40147: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867440.40162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867440.40184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867440.40205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867440.40236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867440.40345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867440.40365: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 23826 1726867440.40441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867440.42391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867440.42441: stderr chunk (state=3): >>><<< 23826 1726867440.42450: stdout chunk (state=3): >>><<< 23826 1726867440.42469: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867440.42555: _low_level_execute_command(): starting 23826 1726867440.42559: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867440.3113563-24985-142203697779906/AnsiballZ_package_facts.py && sleep 0' 23826 1726867440.43081: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867440.43084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867440.43086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867440.43089: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867440.43091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867440.43149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867440.43152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867440.43261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867440.87980: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": 
[{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 23826 1726867440.88027: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", 
"version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 23826 1726867440.88042: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 23826 1726867440.88070: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": 
"libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 23826 1726867440.88097: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", 
"version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": 
"newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 23826 1726867440.88124: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", 
"release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 23826 1726867440.88130: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", 
"epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 23826 1726867440.88159: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": 
"beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": 
"python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 23826 1726867440.88166: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 23826 1726867440.89954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 23826 1726867440.89985: stderr chunk (state=3): >>><<< 23826 1726867440.89988: stdout chunk (state=3): >>><<< 23826 1726867440.90029: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 23826 1726867440.91696: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867440.3113563-24985-142203697779906/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867440.91714: _low_level_execute_command(): starting 23826 1726867440.91719: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867440.3113563-24985-142203697779906/ > /dev/null 2>&1 && sleep 0' 23826 1726867440.92161: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867440.92166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867440.92169: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 23826 1726867440.92171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867440.92173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867440.92222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867440.92229: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867440.92271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867440.94151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867440.94169: stderr chunk (state=3): >>><<< 23826 1726867440.94172: stdout chunk (state=3): >>><<< 23826 1726867440.94189: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867440.94382: handler run complete 23826 1726867440.94752: variable 'ansible_facts' from source: unknown 23826 1726867440.95004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867440.96070: variable 'ansible_facts' from source: unknown 23826 1726867440.96302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867440.96672: attempt loop complete, returning result 23826 1726867440.96682: _execute() done 23826 1726867440.96685: dumping result to json 23826 1726867440.96797: done dumping result, returning 23826 1726867440.96804: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-a92d-a3ea-00000000049b] 23826 1726867440.96811: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000049b 23826 1726867440.98024: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000049b 23826 1726867440.98027: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 23826 1726867440.98103: no more pending results, returning what we have 23826 1726867440.98105: results queue empty 23826 1726867440.98106: checking for any_errors_fatal 23826 1726867440.98109: done checking for any_errors_fatal 23826 1726867440.98110: checking for max_fail_percentage 23826 1726867440.98111: done checking for max_fail_percentage 23826 1726867440.98111: checking to see if all hosts have failed and the running result is not ok 23826 1726867440.98113: done checking to see if all hosts have failed 23826 1726867440.98114: getting the remaining hosts for this loop 23826 1726867440.98115: done getting the remaining hosts for this loop 23826 1726867440.98118: getting the next task for host managed_node2 23826 1726867440.98122: done getting next task for host managed_node2 23826 1726867440.98125: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 23826 1726867440.98126: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867440.98132: getting variables 23826 1726867440.98133: in VariableManager get_vars() 23826 1726867440.98154: Calling all_inventory to load vars for managed_node2 23826 1726867440.98156: Calling groups_inventory to load vars for managed_node2 23826 1726867440.98157: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867440.98163: Calling all_plugins_play to load vars for managed_node2 23826 1726867440.98165: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867440.98166: Calling groups_plugins_play to load vars for managed_node2 23826 1726867440.98883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867440.99755: done with get_vars() 23826 1726867440.99770: done getting variables 23826 1726867440.99815: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:24:00 -0400 (0:00:00.747) 0:00:23.009 ****** 23826 1726867440.99837: entering _queue_task() for managed_node2/debug 23826 1726867441.00051: worker is 1 (out of 1 available) 23826 1726867441.00064: exiting _queue_task() for managed_node2/debug 23826 1726867441.00075: done queuing things up, now waiting for results queue to drain 23826 1726867441.00078: waiting for pending results... 23826 1726867441.00253: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 23826 1726867441.00382: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000068 23826 1726867441.00385: variable 'ansible_search_path' from source: unknown 23826 1726867441.00387: variable 'ansible_search_path' from source: unknown 23826 1726867441.00389: calling self._execute() 23826 1726867441.00442: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867441.00446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867441.00582: variable 'omit' from source: magic vars 23826 1726867441.00811: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.00825: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.00929: variable 'connection_failed' from source: set_fact 23826 1726867441.00938: Evaluated conditional (not connection_failed): True 23826 1726867441.01040: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.01049: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.01144: variable 'connection_failed' from source: set_fact 23826 1726867441.01153: Evaluated conditional (not connection_failed): True 23826 1726867441.01162: variable 'omit' from source: magic vars 23826 1726867441.01200: variable 'omit' from source: magic vars 23826 1726867441.01291: variable 'network_provider' from source: set_fact 23826 1726867441.01314: variable 'omit' from source: magic vars 23826 1726867441.01354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867441.01392: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867441.01415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867441.01433: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867441.01448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867441.01476: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867441.01486: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867441.01492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867441.01584: Set connection var ansible_timeout to 10 23826 1726867441.01598: Set connection var ansible_shell_executable to /bin/sh 23826 1726867441.01603: Set connection var ansible_connection to ssh 23826 1726867441.01616: Set connection var ansible_pipelining to False 23826 1726867441.01621: Set connection var ansible_shell_type to sh 23826 1726867441.01629: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867441.01651: variable 'ansible_shell_executable' from source: unknown 23826 1726867441.01658: variable 'ansible_connection' from source: unknown 23826 1726867441.01665: variable 'ansible_module_compression' from source: unknown 23826 1726867441.01671: variable 'ansible_shell_type' from source: unknown 23826 1726867441.01781: variable 'ansible_shell_executable' from source: unknown 23826 1726867441.01784: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867441.01787: variable 'ansible_pipelining' from source: unknown 23826 1726867441.01788: variable 'ansible_timeout' from source: unknown 23826 1726867441.01790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867441.01822: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867441.01836: variable 'omit' from source: magic vars 23826 1726867441.01845: starting attempt loop 23826 1726867441.01850: running the handler 23826 1726867441.01894: handler run complete 23826 1726867441.01912: attempt loop complete, returning result 23826 1726867441.01918: _execute() done 23826 1726867441.01923: dumping result to json 23826 1726867441.01930: done dumping result, returning 23826 1726867441.01939: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-a92d-a3ea-000000000068] 23826 1726867441.01946: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000068 ok: [managed_node2] => {} MSG: Using network provider: nm 23826 1726867441.02080: no more pending results, returning what we have 23826 1726867441.02084: results queue empty 23826 1726867441.02085: checking for any_errors_fatal 23826 1726867441.02099: done checking for any_errors_fatal 23826 1726867441.02100: checking for max_fail_percentage 23826 1726867441.02102: done checking for max_fail_percentage 23826 1726867441.02103: checking to see if all hosts have failed and the running result is not ok 23826 1726867441.02104: done checking to see if all hosts have 
failed 23826 1726867441.02104: getting the remaining hosts for this loop 23826 1726867441.02106: done getting the remaining hosts for this loop 23826 1726867441.02109: getting the next task for host managed_node2 23826 1726867441.02116: done getting next task for host managed_node2 23826 1726867441.02119: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 23826 1726867441.02120: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867441.02186: getting variables 23826 1726867441.02188: in VariableManager get_vars() 23826 1726867441.02222: Calling all_inventory to load vars for managed_node2 23826 1726867441.02225: Calling groups_inventory to load vars for managed_node2 23826 1726867441.02227: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867441.02291: Calling all_plugins_play to load vars for managed_node2 23826 1726867441.02295: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867441.02298: Calling groups_plugins_play to load vars for managed_node2 23826 1726867441.02816: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000068 23826 1726867441.02820: WORKER PROCESS EXITING 23826 1726867441.03127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867441.04065: done with get_vars() 23826 1726867441.04082: done getting variables 23826 1726867441.04121: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:24:01 -0400 (0:00:00.043) 0:00:23.052 ****** 23826 1726867441.04143: entering _queue_task() for managed_node2/fail 23826 1726867441.04336: worker is 1 (out of 1 available) 23826 1726867441.04349: exiting _queue_task() for managed_node2/fail 23826 1726867441.04360: done queuing things up, now waiting for results queue to drain 23826 1726867441.04361: waiting for pending results... 
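Note: the records above trace two of the role's tasks: the package inventory gathered with module_args {"manager": ["auto"], "strategy": "first"} (result censored by no_log) and the debug task that reports "Using network provider: nm". A minimal sketch of how equivalent tasks could be written, for orientation only; the task names, the module arguments, and the network_provider variable are taken from the log, while the layout and comments are assumptions and may differ from the actual fedora.linux_system_roles.network role:

- name: Check which packages are installed
  package_facts:
    manager: auto      # let the module pick the available package manager (rpm here)
    strategy: first    # use only the first manager that succeeds

- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"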
23826 1726867441.04536: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 23826 1726867441.04611: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000069 23826 1726867441.04626: variable 'ansible_search_path' from source: unknown 23826 1726867441.04630: variable 'ansible_search_path' from source: unknown 23826 1726867441.04657: calling self._execute() 23826 1726867441.04723: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867441.04727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867441.04737: variable 'omit' from source: magic vars 23826 1726867441.04998: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.05007: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.05085: variable 'connection_failed' from source: set_fact 23826 1726867441.05089: Evaluated conditional (not connection_failed): True 23826 1726867441.05165: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.05168: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.05240: variable 'connection_failed' from source: set_fact 23826 1726867441.05244: Evaluated conditional (not connection_failed): True 23826 1726867441.05323: variable 'network_state' from source: role '' defaults 23826 1726867441.05331: Evaluated conditional (network_state != {}): False 23826 1726867441.05334: when evaluation is False, skipping this task 23826 1726867441.05338: _execute() done 23826 1726867441.05341: dumping result to json 23826 1726867441.05343: done dumping result, returning 23826 1726867441.05352: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-a92d-a3ea-000000000069] 23826 1726867441.05355: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000069 23826 1726867441.05433: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000069 23826 1726867441.05436: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 23826 1726867441.05497: no more pending results, returning what we have 23826 1726867441.05500: results queue empty 23826 1726867441.05501: checking for any_errors_fatal 23826 1726867441.05506: done checking for any_errors_fatal 23826 1726867441.05507: checking for max_fail_percentage 23826 1726867441.05508: done checking for max_fail_percentage 23826 1726867441.05508: checking to see if all hosts have failed and the running result is not ok 23826 1726867441.05509: done checking to see if all hosts have failed 23826 1726867441.05510: getting the remaining hosts for this loop 23826 1726867441.05511: done getting the remaining hosts for this loop 23826 1726867441.05515: getting the next task for host managed_node2 23826 1726867441.05519: done getting next task for host managed_node2 23826 1726867441.05522: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 23826 1726867441.05524: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867441.05538: getting variables 23826 1726867441.05540: in VariableManager get_vars() 23826 1726867441.05568: Calling all_inventory to load vars for managed_node2 23826 1726867441.05570: Calling groups_inventory to load vars for managed_node2 23826 1726867441.05572: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867441.05582: Calling all_plugins_play to load vars for managed_node2 23826 1726867441.05585: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867441.05588: Calling groups_plugins_play to load vars for managed_node2 23826 1726867441.09487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867441.10328: done with get_vars() 23826 1726867441.10342: done getting variables 23826 1726867441.10374: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:24:01 -0400 (0:00:00.062) 0:00:23.115 ****** 23826 1726867441.10395: entering _queue_task() for managed_node2/fail 23826 1726867441.10629: worker is 1 (out of 1 available) 23826 1726867441.10642: exiting _queue_task() for managed_node2/fail 23826 1726867441.10653: done queuing things up, now waiting for results queue to drain 23826 1726867441.10655: waiting for pending results... 
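The skip recorded above comes from the task's own `when` guard: Ansible evaluated network_state != {} as False and reported it as the false_condition. When `when` is written as a list, the entries are checked in order and only the first one that evaluates False is reported, so any further guard (for example, a check that the provider is initscripts, as the task name suggests) would not appear in this trace. A minimal sketch of what the task at tasks/main.yml:11 may look like, with everything beyond the reported condition marked as an assumption:

    # Only "network_state != {}" is taken from the log; the fail message and the
    # provider guard are assumptions suggested by the task name.
    - name: >-
        Abort applying the network state configuration if using the
        `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying the network state configuration with the initscripts provider is not supported
      when:
        - network_state != {}
        - network_provider == "initscripts"   # assumed second guard, not visible in this trace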
23826 1726867441.10829: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 23826 1726867441.10916: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000006a 23826 1726867441.10930: variable 'ansible_search_path' from source: unknown 23826 1726867441.10933: variable 'ansible_search_path' from source: unknown 23826 1726867441.10960: calling self._execute() 23826 1726867441.11035: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867441.11041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867441.11049: variable 'omit' from source: magic vars 23826 1726867441.11320: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.11330: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.11413: variable 'connection_failed' from source: set_fact 23826 1726867441.11416: Evaluated conditional (not connection_failed): True 23826 1726867441.11492: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.11495: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.11562: variable 'connection_failed' from source: set_fact 23826 1726867441.11566: Evaluated conditional (not connection_failed): True 23826 1726867441.11644: variable 'network_state' from source: role '' defaults 23826 1726867441.11658: Evaluated conditional (network_state != {}): False 23826 1726867441.11661: when evaluation is False, skipping this task 23826 1726867441.11664: _execute() done 23826 1726867441.11667: dumping result to json 23826 1726867441.11670: done dumping result, returning 23826 1726867441.11672: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-a92d-a3ea-00000000006a] 23826 1726867441.11675: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000006a 23826 1726867441.11764: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000006a 23826 1726867441.11767: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 23826 1726867441.11816: no more pending results, returning what we have 23826 1726867441.11819: results queue empty 23826 1726867441.11820: checking for any_errors_fatal 23826 1726867441.11832: done checking for any_errors_fatal 23826 1726867441.11833: checking for max_fail_percentage 23826 1726867441.11835: done checking for max_fail_percentage 23826 1726867441.11836: checking to see if all hosts have failed and the running result is not ok 23826 1726867441.11837: done checking to see if all hosts have failed 23826 1726867441.11838: getting the remaining hosts for this loop 23826 1726867441.11839: done getting the remaining hosts for this loop 23826 1726867441.11842: getting the next task for host managed_node2 23826 1726867441.11848: done getting next task for host managed_node2 23826 1726867441.11851: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 23826 1726867441.11852: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867441.11865: getting variables 23826 1726867441.11867: in VariableManager get_vars() 23826 1726867441.11901: Calling all_inventory to load vars for managed_node2 23826 1726867441.11903: Calling groups_inventory to load vars for managed_node2 23826 1726867441.11905: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867441.11916: Calling all_plugins_play to load vars for managed_node2 23826 1726867441.11918: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867441.11920: Calling groups_plugins_play to load vars for managed_node2 23826 1726867441.12714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867441.13592: done with get_vars() 23826 1726867441.13607: done getting variables 23826 1726867441.13646: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:24:01 -0400 (0:00:00.032) 0:00:23.148 ****** 23826 1726867441.13667: entering _queue_task() for managed_node2/fail 23826 1726867441.13860: worker is 1 (out of 1 available) 23826 1726867441.13871: exiting _queue_task() for managed_node2/fail 23826 1726867441.13884: done queuing things up, now waiting for results queue to drain 23826 1726867441.13886: waiting for pending results... 
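A pattern worth noting in this trace: before each role task checks its own guard, the worker re-evaluates `ansible_distribution_major_version != '6'` and `not connection_failed`. That is the behavior of conditionals inherited from an enclosing structure in the calling playbook rather than conditions written on the role tasks themselves. The fragment below is an illustration of that mechanism under assumed names, not the actual test playbook: a `when` placed on a block (or on a static import) is attached to every task inside it and is checked ahead of the task's own `when`.

    # Illustration only: an enclosing block whose conditions are inherited by every
    # task of the statically imported role, matching the repeated evaluations above.
    - name: Exercise the network role on supported hosts   # assumed wrapper, not from this playbook
      when:
        - ansible_distribution_major_version != '6'
        - not connection_failed
      block:
        - name: Import the role so the block conditions propagate to its tasks
          ansible.builtin.import_role:
            name: fedora.linux_system_roles.network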
23826 1726867441.14064: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 23826 1726867441.14139: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000006b 23826 1726867441.14151: variable 'ansible_search_path' from source: unknown 23826 1726867441.14154: variable 'ansible_search_path' from source: unknown 23826 1726867441.14181: calling self._execute() 23826 1726867441.14257: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867441.14262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867441.14270: variable 'omit' from source: magic vars 23826 1726867441.14535: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.14546: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.14622: variable 'connection_failed' from source: set_fact 23826 1726867441.14626: Evaluated conditional (not connection_failed): True 23826 1726867441.14703: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.14709: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.14772: variable 'connection_failed' from source: set_fact 23826 1726867441.14776: Evaluated conditional (not connection_failed): True 23826 1726867441.14893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867441.16337: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867441.16387: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867441.16416: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867441.16441: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867441.16462: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867441.16519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867441.16540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867441.16557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.16585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867441.16596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867441.16660: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.16671: Evaluated conditional (ansible_distribution_major_version | int > 9): True 23826 1726867441.16747: variable 
'ansible_distribution' from source: facts 23826 1726867441.16751: variable '__network_rh_distros' from source: role '' defaults 23826 1726867441.16758: Evaluated conditional (ansible_distribution in __network_rh_distros): True 23826 1726867441.16917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867441.16936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867441.16956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.16983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867441.16994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867441.17041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867441.17059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867441.17076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.17101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867441.17182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867441.17185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867441.17187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867441.17189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.17210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867441.17230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867441.17526: variable 'network_connections' from source: play vars 23826 1726867441.17543: variable 'profile' from source: play vars 23826 1726867441.17616: variable 'profile' from source: play vars 23826 1726867441.17627: variable 'interface' from source: set_fact 23826 1726867441.17701: variable 'interface' from source: set_fact 23826 1726867441.17719: variable 'network_state' from source: role '' defaults 23826 1726867441.17792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867441.17975: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867441.18020: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867441.18083: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867441.18185: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867441.18189: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867441.18191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867441.18212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.18246: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867441.18279: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 23826 1726867441.18289: when evaluation is False, skipping this task 23826 1726867441.18298: _execute() done 23826 1726867441.18307: dumping result to json 23826 1726867441.18316: done dumping result, returning 23826 1726867441.18329: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-a92d-a3ea-00000000006b] 23826 1726867441.18340: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000006b skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 23826 1726867441.18525: no more pending results, returning what we have 23826 1726867441.18529: results queue empty 23826 1726867441.18530: checking for any_errors_fatal 23826 1726867441.18535: done checking for any_errors_fatal 23826 1726867441.18536: checking for max_fail_percentage 23826 
1726867441.18538: done checking for max_fail_percentage 23826 1726867441.18539: checking to see if all hosts have failed and the running result is not ok 23826 1726867441.18540: done checking to see if all hosts have failed 23826 1726867441.18540: getting the remaining hosts for this loop 23826 1726867441.18542: done getting the remaining hosts for this loop 23826 1726867441.18546: getting the next task for host managed_node2 23826 1726867441.18552: done getting next task for host managed_node2 23826 1726867441.18556: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 23826 1726867441.18558: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867441.18571: getting variables 23826 1726867441.18573: in VariableManager get_vars() 23826 1726867441.18614: Calling all_inventory to load vars for managed_node2 23826 1726867441.18618: Calling groups_inventory to load vars for managed_node2 23826 1726867441.18620: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867441.18630: Calling all_plugins_play to load vars for managed_node2 23826 1726867441.18634: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867441.18637: Calling groups_plugins_play to load vars for managed_node2 23826 1726867441.19411: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000006b 23826 1726867441.19414: WORKER PROCESS EXITING 23826 1726867441.20741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867441.22815: done with get_vars() 23826 1726867441.22838: done getting variables 23826 1726867441.22901: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:24:01 -0400 (0:00:00.092) 0:00:23.240 ****** 23826 1726867441.22940: entering _queue_task() for managed_node2/dnf 23826 1726867441.23256: worker is 1 (out of 1 available) 23826 1726867441.23267: exiting _queue_task() for managed_node2/dnf 23826 1726867441.23281: done queuing things up, now waiting for results queue to drain 23826 1726867441.23283: waiting for pending results... 
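The teaming abort task above was skipped because its guard came out False, and in this case the trace quotes every expression that was evaluated, so the `when` list can be reconstructed almost verbatim. Only the layout, the module body, and the fail message below are assumptions:

    # The three `when` entries are taken from the conditionals evaluated in the trace above.
    - name: >-
        Abort applying teaming configuration if the system version of the
        managed host is EL10 or later
      ansible.builtin.fail:
        msg: Teaming is not supported on this system version   # message text assumed
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        - >-
          network_connections | selectattr("type", "defined") |
          selectattr("type", "match", "^team$") | list | length > 0
          or network_state.get("interfaces", []) | selectattr("type", "defined") |
          selectattr("type", "match", "^team$") | list | length > 0

The first two entries evaluated True in the trace (the host reports a major version above 9 and a distribution in __network_rh_distros), but neither network_connections nor network_state defines a connection of type "team", so the third entry is False and the task is skipped.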
23826 1726867441.23543: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 23826 1726867441.23655: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000006c 23826 1726867441.23679: variable 'ansible_search_path' from source: unknown 23826 1726867441.23689: variable 'ansible_search_path' from source: unknown 23826 1726867441.23729: calling self._execute() 23826 1726867441.23830: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867441.23843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867441.23860: variable 'omit' from source: magic vars 23826 1726867441.24224: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.24241: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.24353: variable 'connection_failed' from source: set_fact 23826 1726867441.24366: Evaluated conditional (not connection_failed): True 23826 1726867441.24481: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.24493: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.24588: variable 'connection_failed' from source: set_fact 23826 1726867441.24599: Evaluated conditional (not connection_failed): True 23826 1726867441.24785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867441.27181: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867441.27246: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867441.27301: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867441.27339: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867441.27371: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867441.27451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867441.27487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867441.27520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.27566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867441.27590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867441.27701: variable 'ansible_distribution' from source: facts 23826 1726867441.27711: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.27734: Evaluated 
conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 23826 1726867441.27865: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867441.28002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867441.28035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867441.28087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.28166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867441.28169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867441.28199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867441.28229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867441.28260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.28309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867441.28530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867441.28533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867441.28536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867441.28537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.28559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867441.28576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 23826 1726867441.28906: variable 'network_connections' from source: play vars 23826 1726867441.28924: variable 'profile' from source: play vars 23826 1726867441.29002: variable 'profile' from source: play vars 23826 1726867441.29012: variable 'interface' from source: set_fact 23826 1726867441.29091: variable 'interface' from source: set_fact 23826 1726867441.29169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867441.29487: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867441.29563: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867441.29602: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867441.29723: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867441.29861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867441.29864: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867441.29867: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.29870: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867441.29908: variable '__network_team_connections_defined' from source: role '' defaults 23826 1726867441.30185: variable 'network_connections' from source: play vars 23826 1726867441.30197: variable 'profile' from source: play vars 23826 1726867441.30262: variable 'profile' from source: play vars 23826 1726867441.30272: variable 'interface' from source: set_fact 23826 1726867441.30339: variable 'interface' from source: set_fact 23826 1726867441.30367: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 23826 1726867441.30376: when evaluation is False, skipping this task 23826 1726867441.30387: _execute() done 23826 1726867441.30424: dumping result to json 23826 1726867441.30427: done dumping result, returning 23826 1726867441.30429: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-a92d-a3ea-00000000006c] 23826 1726867441.30431: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000006c skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 23826 1726867441.30575: no more pending results, returning what we have 23826 1726867441.30582: results queue empty 23826 1726867441.30583: checking for any_errors_fatal 23826 1726867441.30591: done checking for any_errors_fatal 23826 1726867441.30592: checking for max_fail_percentage 23826 1726867441.30594: done checking for 
max_fail_percentage 23826 1726867441.30595: checking to see if all hosts have failed and the running result is not ok 23826 1726867441.30596: done checking to see if all hosts have failed 23826 1726867441.30596: getting the remaining hosts for this loop 23826 1726867441.30598: done getting the remaining hosts for this loop 23826 1726867441.30602: getting the next task for host managed_node2 23826 1726867441.30609: done getting next task for host managed_node2 23826 1726867441.30614: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 23826 1726867441.30616: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867441.30630: getting variables 23826 1726867441.30632: in VariableManager get_vars() 23826 1726867441.30672: Calling all_inventory to load vars for managed_node2 23826 1726867441.30675: Calling groups_inventory to load vars for managed_node2 23826 1726867441.30680: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867441.30692: Calling all_plugins_play to load vars for managed_node2 23826 1726867441.30695: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867441.30698: Calling groups_plugins_play to load vars for managed_node2 23826 1726867441.31591: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000006c 23826 1726867441.31594: WORKER PROCESS EXITING 23826 1726867441.32472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867441.33987: done with get_vars() 23826 1726867441.34007: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 23826 1726867441.34081: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:24:01 -0400 (0:00:00.111) 0:00:23.352 ****** 23826 1726867441.34110: entering _queue_task() for managed_node2/yum 23826 1726867441.34380: worker is 1 (out of 1 available) 23826 1726867441.34393: exiting _queue_task() for managed_node2/yum 23826 1726867441.34405: done queuing things up, now waiting for results queue to drain 23826 1726867441.34406: waiting for pending results... 
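Two details from the span above are worth spelling out. First, the DNF check was skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined resolved to true for this profile, after the Fedora-or-EL8+ guard had already passed. Second, the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line for the next task is ansible-core resolving the legacy yum action to the dnf implementation on this control node, so both package-manager variants end up handled by the same plugin. A rough sketch of the DNF variant, with the `when` entries taken from the trace and the module arguments marked as assumptions:

    # Only the two `when` expressions come from the trace; the package list and
    # desired state are assumptions about how such a check might be written.
    - name: >-
        Check if updates for network packages are available through the
        DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # assumed
        state: latest                    # assumed
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined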
23826 1726867441.34667: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 23826 1726867441.34781: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000006d 23826 1726867441.34804: variable 'ansible_search_path' from source: unknown 23826 1726867441.34812: variable 'ansible_search_path' from source: unknown 23826 1726867441.34850: calling self._execute() 23826 1726867441.34947: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867441.34961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867441.34981: variable 'omit' from source: magic vars 23826 1726867441.35352: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.35370: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.35487: variable 'connection_failed' from source: set_fact 23826 1726867441.35499: Evaluated conditional (not connection_failed): True 23826 1726867441.35610: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.35622: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.35765: variable 'connection_failed' from source: set_fact 23826 1726867441.35768: Evaluated conditional (not connection_failed): True 23826 1726867441.35912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867441.38087: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867441.38163: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867441.38205: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867441.38242: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867441.38482: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867441.38486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867441.38488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867441.38491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.38493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867441.38495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867441.38570: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.38593: Evaluated conditional (ansible_distribution_major_version | int < 8): False 23826 
1726867441.38602: when evaluation is False, skipping this task 23826 1726867441.38614: _execute() done 23826 1726867441.38620: dumping result to json 23826 1726867441.38628: done dumping result, returning 23826 1726867441.38640: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-a92d-a3ea-00000000006d] 23826 1726867441.38650: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000006d 23826 1726867441.38983: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000006d 23826 1726867441.38987: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 23826 1726867441.39030: no more pending results, returning what we have 23826 1726867441.39033: results queue empty 23826 1726867441.39034: checking for any_errors_fatal 23826 1726867441.39038: done checking for any_errors_fatal 23826 1726867441.39039: checking for max_fail_percentage 23826 1726867441.39040: done checking for max_fail_percentage 23826 1726867441.39041: checking to see if all hosts have failed and the running result is not ok 23826 1726867441.39042: done checking to see if all hosts have failed 23826 1726867441.39043: getting the remaining hosts for this loop 23826 1726867441.39044: done getting the remaining hosts for this loop 23826 1726867441.39047: getting the next task for host managed_node2 23826 1726867441.39053: done getting next task for host managed_node2 23826 1726867441.39056: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 23826 1726867441.39058: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867441.39071: getting variables 23826 1726867441.39072: in VariableManager get_vars() 23826 1726867441.39107: Calling all_inventory to load vars for managed_node2 23826 1726867441.39110: Calling groups_inventory to load vars for managed_node2 23826 1726867441.39112: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867441.39120: Calling all_plugins_play to load vars for managed_node2 23826 1726867441.39123: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867441.39126: Calling groups_plugins_play to load vars for managed_node2 23826 1726867441.40485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867441.42032: done with get_vars() 23826 1726867441.42054: done getting variables 23826 1726867441.42112: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:24:01 -0400 (0:00:00.080) 0:00:23.432 ****** 23826 1726867441.42146: entering _queue_task() for managed_node2/fail 23826 1726867441.42421: worker is 1 (out of 1 available) 23826 1726867441.42433: exiting _queue_task() for managed_node2/fail 23826 1726867441.42444: done queuing things up, now waiting for results queue to drain 23826 1726867441.42445: waiting for pending results... 
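The YUM-flavored check above is the complement of the DNF one: its first guard, ansible_distribution_major_version | int < 8, is reported as the false_condition because the facts for this host put the major version above 9, so the remaining entries in the `when` list are never evaluated. A minimal sketch under that reading, with everything other than the reported guard marked as an assumption:

    # Only "ansible_distribution_major_version | int < 8" comes from the trace.
    - name: >-
        Check if updates for network packages are available through the
        YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:               # resolved to the dnf action plugin by ansible-core 2.17
        name: "{{ network_packages }}"   # assumed
        state: latest                    # assumed
      when:
        - ansible_distribution_major_version | int < 8
        - __network_wireless_connections_defined or __network_team_connections_defined   # assumed to mirror the DNF variant; never reached here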
23826 1726867441.42723: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 23826 1726867441.42839: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000006e 23826 1726867441.42860: variable 'ansible_search_path' from source: unknown 23826 1726867441.42869: variable 'ansible_search_path' from source: unknown 23826 1726867441.42916: calling self._execute() 23826 1726867441.43011: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867441.43024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867441.43038: variable 'omit' from source: magic vars 23826 1726867441.43403: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.43445: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.43533: variable 'connection_failed' from source: set_fact 23826 1726867441.43543: Evaluated conditional (not connection_failed): True 23826 1726867441.43651: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.43882: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.43885: variable 'connection_failed' from source: set_fact 23826 1726867441.43887: Evaluated conditional (not connection_failed): True 23826 1726867441.43889: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867441.44075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867441.46432: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867441.46511: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867441.46550: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867441.46589: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867441.46624: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867441.46702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867441.46740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867441.46770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.46816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867441.46842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867441.46894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867441.46922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867441.46959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.47007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867441.47027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867441.47078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867441.47107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867441.47136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.47186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867441.47207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867441.47385: variable 'network_connections' from source: play vars 23826 1726867441.47400: variable 'profile' from source: play vars 23826 1726867441.47470: variable 'profile' from source: play vars 23826 1726867441.47486: variable 'interface' from source: set_fact 23826 1726867441.47547: variable 'interface' from source: set_fact 23826 1726867441.47682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867441.47792: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867441.47836: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867441.47884: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867441.47922: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867441.47967: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867441.47997: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 
1726867441.48032: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.48065: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867441.48117: variable '__network_team_connections_defined' from source: role '' defaults 23826 1726867441.48455: variable 'network_connections' from source: play vars 23826 1726867441.48458: variable 'profile' from source: play vars 23826 1726867441.48460: variable 'profile' from source: play vars 23826 1726867441.48463: variable 'interface' from source: set_fact 23826 1726867441.48506: variable 'interface' from source: set_fact 23826 1726867441.48535: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 23826 1726867441.48542: when evaluation is False, skipping this task 23826 1726867441.48550: _execute() done 23826 1726867441.48558: dumping result to json 23826 1726867441.48674: done dumping result, returning 23826 1726867441.48679: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-a92d-a3ea-00000000006e] 23826 1726867441.48682: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000006e 23826 1726867441.48746: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000006e 23826 1726867441.48749: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 23826 1726867441.48862: no more pending results, returning what we have 23826 1726867441.48872: results queue empty 23826 1726867441.48873: checking for any_errors_fatal 23826 1726867441.48887: done checking for any_errors_fatal 23826 1726867441.48888: checking for max_fail_percentage 23826 1726867441.48896: done checking for max_fail_percentage 23826 1726867441.48897: checking to see if all hosts have failed and the running result is not ok 23826 1726867441.48898: done checking to see if all hosts have failed 23826 1726867441.48899: getting the remaining hosts for this loop 23826 1726867441.48900: done getting the remaining hosts for this loop 23826 1726867441.48904: getting the next task for host managed_node2 23826 1726867441.48910: done getting next task for host managed_node2 23826 1726867441.48914: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 23826 1726867441.48917: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867441.48930: getting variables 23826 1726867441.48933: in VariableManager get_vars() 23826 1726867441.48970: Calling all_inventory to load vars for managed_node2 23826 1726867441.48973: Calling groups_inventory to load vars for managed_node2 23826 1726867441.48976: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867441.49218: Calling all_plugins_play to load vars for managed_node2 23826 1726867441.49222: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867441.49225: Calling groups_plugins_play to load vars for managed_node2 23826 1726867441.50968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867441.54287: done with get_vars() 23826 1726867441.54310: done getting variables 23826 1726867441.54369: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:24:01 -0400 (0:00:00.122) 0:00:23.555 ****** 23826 1726867441.54401: entering _queue_task() for managed_node2/package 23826 1726867441.54806: worker is 1 (out of 1 available) 23826 1726867441.54818: exiting _queue_task() for managed_node2/package 23826 1726867441.54828: done queuing things up, now waiting for results queue to drain 23826 1726867441.54829: waiting for pending results... 23826 1726867441.55014: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 23826 1726867441.55125: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000006f 23826 1726867441.55164: variable 'ansible_search_path' from source: unknown 23826 1726867441.55167: variable 'ansible_search_path' from source: unknown 23826 1726867441.55199: calling self._execute() 23826 1726867441.55381: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867441.55385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867441.55388: variable 'omit' from source: magic vars 23826 1726867441.55691: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.55711: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.55823: variable 'connection_failed' from source: set_fact 23826 1726867441.55837: Evaluated conditional (not connection_failed): True 23826 1726867441.56045: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.56061: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.56482: variable 'connection_failed' from source: set_fact 23826 1726867441.56487: Evaluated conditional (not connection_failed): True 23826 1726867441.56581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867441.57174: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867441.57303: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867441.57495: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867441.57536: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867441.57752: variable 'network_packages' from source: role '' defaults 23826 1726867441.57969: variable '__network_provider_setup' from source: role '' defaults 23826 1726867441.58032: variable '__network_service_name_default_nm' from source: role '' defaults 23826 1726867441.58105: variable '__network_service_name_default_nm' from source: role '' defaults 23826 1726867441.58484: variable '__network_packages_default_nm' from source: role '' defaults 23826 1726867441.58487: variable '__network_packages_default_nm' from source: role '' defaults 23826 1726867441.58697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867441.63674: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867441.63682: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867441.63785: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867441.63826: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867441.63869: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867441.64084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867441.64097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867441.64243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.64290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867441.64313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867441.64367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867441.64460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867441.64554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.64599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 23826 1726867441.64668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867441.65145: variable '__network_packages_default_gobject_packages' from source: role '' defaults 23826 1726867441.65420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867441.65584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867441.65587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.65620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867441.65712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867441.65922: variable 'ansible_python' from source: facts 23826 1726867441.65952: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 23826 1726867441.66156: variable '__network_wpa_supplicant_required' from source: role '' defaults 23826 1726867441.66296: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 23826 1726867441.66588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867441.66620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867441.66698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.66751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867441.66985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867441.66989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867441.66991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867441.67098: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.67144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867441.67162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867441.67448: variable 'network_connections' from source: play vars 23826 1726867441.67491: variable 'profile' from source: play vars 23826 1726867441.67631: variable 'profile' from source: play vars 23826 1726867441.67758: variable 'interface' from source: set_fact 23826 1726867441.67884: variable 'interface' from source: set_fact 23826 1726867441.68022: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867441.68109: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867441.68219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867441.68255: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867441.68337: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867441.69094: variable 'network_connections' from source: play vars 23826 1726867441.69130: variable 'profile' from source: play vars 23826 1726867441.69288: variable 'profile' from source: play vars 23826 1726867441.69448: variable 'interface' from source: set_fact 23826 1726867441.69557: variable 'interface' from source: set_fact 23826 1726867441.69734: variable '__network_packages_default_wireless' from source: role '' defaults 23826 1726867441.69803: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867441.70552: variable 'network_connections' from source: play vars 23826 1726867441.70613: variable 'profile' from source: play vars 23826 1726867441.70732: variable 'profile' from source: play vars 23826 1726867441.70860: variable 'interface' from source: set_fact 23826 1726867441.70993: variable 'interface' from source: set_fact 23826 1726867441.71067: variable '__network_packages_default_team' from source: role '' defaults 23826 1726867441.71293: variable '__network_team_connections_defined' from source: role '' defaults 23826 1726867441.72167: variable 'network_connections' from source: play vars 23826 1726867441.72271: variable 'profile' from source: play vars 23826 1726867441.72327: variable 'profile' from source: play vars 23826 1726867441.72387: variable 'interface' from source: set_fact 23826 1726867441.72785: variable 'interface' from source: set_fact 23826 1726867441.72788: variable '__network_service_name_default_initscripts' from source: role '' defaults 23826 1726867441.72924: variable 
'__network_service_name_default_initscripts' from source: role '' defaults 23826 1726867441.72937: variable '__network_packages_default_initscripts' from source: role '' defaults 23826 1726867441.73005: variable '__network_packages_default_initscripts' from source: role '' defaults 23826 1726867441.73547: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 23826 1726867441.74937: variable 'network_connections' from source: play vars 23826 1726867441.74949: variable 'profile' from source: play vars 23826 1726867441.75031: variable 'profile' from source: play vars 23826 1726867441.75041: variable 'interface' from source: set_fact 23826 1726867441.75111: variable 'interface' from source: set_fact 23826 1726867441.75134: variable 'ansible_distribution' from source: facts 23826 1726867441.75143: variable '__network_rh_distros' from source: role '' defaults 23826 1726867441.75154: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.75173: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 23826 1726867441.75363: variable 'ansible_distribution' from source: facts 23826 1726867441.75372: variable '__network_rh_distros' from source: role '' defaults 23826 1726867441.75386: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.75403: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 23826 1726867441.75587: variable 'ansible_distribution' from source: facts 23826 1726867441.75680: variable '__network_rh_distros' from source: role '' defaults 23826 1726867441.75683: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.75686: variable 'network_provider' from source: set_fact 23826 1726867441.75688: variable 'ansible_facts' from source: unknown 23826 1726867441.76772: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 23826 1726867441.76775: when evaluation is False, skipping this task 23826 1726867441.76780: _execute() done 23826 1726867441.76783: dumping result to json 23826 1726867441.76784: done dumping result, returning 23826 1726867441.76799: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-a92d-a3ea-00000000006f] 23826 1726867441.76815: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000006f skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 23826 1726867441.77236: no more pending results, returning what we have 23826 1726867441.77240: results queue empty 23826 1726867441.77241: checking for any_errors_fatal 23826 1726867441.77248: done checking for any_errors_fatal 23826 1726867441.77249: checking for max_fail_percentage 23826 1726867441.77251: done checking for max_fail_percentage 23826 1726867441.77253: checking to see if all hosts have failed and the running result is not ok 23826 1726867441.77254: done checking to see if all hosts have failed 23826 1726867441.77255: getting the remaining hosts for this loop 23826 1726867441.77256: done getting the remaining hosts for this loop 23826 1726867441.77261: getting the next task for host managed_node2 23826 1726867441.77268: done getting next task for host managed_node2 23826 1726867441.77272: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using 
network_state variable 23826 1726867441.77275: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867441.77299: getting variables 23826 1726867441.77301: in VariableManager get_vars() 23826 1726867441.77345: Calling all_inventory to load vars for managed_node2 23826 1726867441.77348: Calling groups_inventory to load vars for managed_node2 23826 1726867441.77351: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867441.77362: Calling all_plugins_play to load vars for managed_node2 23826 1726867441.77365: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867441.77368: Calling groups_plugins_play to load vars for managed_node2 23826 1726867441.78118: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000006f 23826 1726867441.78122: WORKER PROCESS EXITING 23826 1726867441.79892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867441.82893: done with get_vars() 23826 1726867441.82923: done getting variables 23826 1726867441.83087: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:24:01 -0400 (0:00:00.287) 0:00:23.842 ****** 23826 1726867441.83122: entering _queue_task() for managed_node2/package 23826 1726867441.84002: worker is 1 (out of 1 available) 23826 1726867441.84017: exiting _queue_task() for managed_node2/package 23826 1726867441.84027: done queuing things up, now waiting for results queue to drain 23826 1726867441.84028: waiting for pending results... 
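The "Install packages" task above is skipped because its conditional, not network_packages is subset(ansible_facts.packages.keys()), evaluated to False, i.e. every package the role would install is already present in the gathered package facts. The snippet below is a minimal sketch that mirrors the semantics of that subset check in plain Python; the package names and versions are hypothetical stand-ins, not values taken from this run.

```python
# Minimal sketch of the check behind
# "not network_packages is subset(ansible_facts.packages.keys())".
# Package names/versions below are hypothetical examples, not facts from this run.

network_packages = ["NetworkManager"]                     # what the role would install
installed = {                                             # shape of ansible_facts.packages
    "NetworkManager": [{"version": "1.0"}],
    "openssh-server": [{"version": "9.8"}],
}

# The subset test reduces to a set comparison like this:
already_installed = set(network_packages) <= set(installed.keys())

# The task's when-clause is the negation, so nothing is missing -> task skipped.
run_install_task = not already_installed
print(run_install_task)  # False, matching "skip_reason: Conditional result was False"
```
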
23826 1726867441.84241: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 23826 1726867441.84366: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000070 23826 1726867441.84389: variable 'ansible_search_path' from source: unknown 23826 1726867441.84399: variable 'ansible_search_path' from source: unknown 23826 1726867441.84447: calling self._execute() 23826 1726867441.84554: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867441.84568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867441.84589: variable 'omit' from source: magic vars 23826 1726867441.84993: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.85013: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.85133: variable 'connection_failed' from source: set_fact 23826 1726867441.85145: Evaluated conditional (not connection_failed): True 23826 1726867441.85262: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.85273: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.85381: variable 'connection_failed' from source: set_fact 23826 1726867441.85446: Evaluated conditional (not connection_failed): True 23826 1726867441.85685: variable 'network_state' from source: role '' defaults 23826 1726867441.85744: Evaluated conditional (network_state != {}): False 23826 1726867441.86188: when evaluation is False, skipping this task 23826 1726867441.86192: _execute() done 23826 1726867441.86194: dumping result to json 23826 1726867441.86197: done dumping result, returning 23826 1726867441.86199: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-a92d-a3ea-000000000070] 23826 1726867441.86206: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000070 23826 1726867441.86279: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000070 23826 1726867441.86282: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 23826 1726867441.86336: no more pending results, returning what we have 23826 1726867441.86340: results queue empty 23826 1726867441.86341: checking for any_errors_fatal 23826 1726867441.86350: done checking for any_errors_fatal 23826 1726867441.86350: checking for max_fail_percentage 23826 1726867441.86352: done checking for max_fail_percentage 23826 1726867441.86353: checking to see if all hosts have failed and the running result is not ok 23826 1726867441.86354: done checking to see if all hosts have failed 23826 1726867441.86355: getting the remaining hosts for this loop 23826 1726867441.86356: done getting the remaining hosts for this loop 23826 1726867441.86360: getting the next task for host managed_node2 23826 1726867441.86368: done getting next task for host managed_node2 23826 1726867441.86372: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 23826 1726867441.86374: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 23826 1726867441.86391: getting variables 23826 1726867441.86393: in VariableManager get_vars() 23826 1726867441.86436: Calling all_inventory to load vars for managed_node2 23826 1726867441.86439: Calling groups_inventory to load vars for managed_node2 23826 1726867441.86441: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867441.86455: Calling all_plugins_play to load vars for managed_node2 23826 1726867441.86458: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867441.86461: Calling groups_plugins_play to load vars for managed_node2 23826 1726867441.89697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867441.91868: done with get_vars() 23826 1726867441.91892: done getting variables 23826 1726867441.91959: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:24:01 -0400 (0:00:00.088) 0:00:23.931 ****** 23826 1726867441.91993: entering _queue_task() for managed_node2/package 23826 1726867441.92328: worker is 1 (out of 1 available) 23826 1726867441.92342: exiting _queue_task() for managed_node2/package 23826 1726867441.92559: done queuing things up, now waiting for results queue to drain 23826 1726867441.92561: waiting for pending results... 
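Several consecutive role tasks in this stretch of the log are skipped, each with a skipping: [...] => {...} result that records the failed test in its "false_condition" field. When auditing a long -vvvv log like this one, pulling those fields out gives a quick summary of why nothing changed. The sketch below is an illustrative helper only: LOG_PATH is a placeholder for a saved copy of this output, and the pairing heuristic simply takes the first false_condition printed after each skip marker, which is good enough for this log.

```python
import re

# Illustrative helper: summarize skipped tasks from a saved copy of this log.
# LOG_PATH is a placeholder; this run's output was not actually written here.
LOG_PATH = "ansible-vvvv.log"

skip_re = re.compile(r'skipping: \[([^\]]+)\]')
cond_re = re.compile(r'"false_condition": "([^"]+)"')

with open(LOG_PATH, encoding="utf-8") as fh:
    text = fh.read()

# Pair each skip marker with the first false_condition printed after it.
for skip in skip_re.finditer(text):
    cond = cond_re.search(text, skip.end())
    if cond:
        print(f"{skip.group(1)}: skipped, condition {cond.group(1)!r} was false")
```
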
23826 1726867441.93060: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 23826 1726867441.93235: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000071 23826 1726867441.93286: variable 'ansible_search_path' from source: unknown 23826 1726867441.93486: variable 'ansible_search_path' from source: unknown 23826 1726867441.93490: calling self._execute() 23826 1726867441.93631: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867441.93644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867441.93660: variable 'omit' from source: magic vars 23826 1726867441.94460: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.94486: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.94618: variable 'connection_failed' from source: set_fact 23826 1726867441.94629: Evaluated conditional (not connection_failed): True 23826 1726867441.94750: variable 'ansible_distribution_major_version' from source: facts 23826 1726867441.94762: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867441.94872: variable 'connection_failed' from source: set_fact 23826 1726867441.94885: Evaluated conditional (not connection_failed): True 23826 1726867441.95018: variable 'network_state' from source: role '' defaults 23826 1726867441.95035: Evaluated conditional (network_state != {}): False 23826 1726867441.95130: when evaluation is False, skipping this task 23826 1726867441.95134: _execute() done 23826 1726867441.95136: dumping result to json 23826 1726867441.95139: done dumping result, returning 23826 1726867441.95141: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-a92d-a3ea-000000000071] 23826 1726867441.95143: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000071 23826 1726867441.95219: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000071 23826 1726867441.95222: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 23826 1726867441.95285: no more pending results, returning what we have 23826 1726867441.95290: results queue empty 23826 1726867441.95291: checking for any_errors_fatal 23826 1726867441.95298: done checking for any_errors_fatal 23826 1726867441.95299: checking for max_fail_percentage 23826 1726867441.95300: done checking for max_fail_percentage 23826 1726867441.95301: checking to see if all hosts have failed and the running result is not ok 23826 1726867441.95302: done checking to see if all hosts have failed 23826 1726867441.95303: getting the remaining hosts for this loop 23826 1726867441.95304: done getting the remaining hosts for this loop 23826 1726867441.95311: getting the next task for host managed_node2 23826 1726867441.95318: done getting next task for host managed_node2 23826 1726867441.95322: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 23826 1726867441.95324: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 23826 1726867441.95380: getting variables 23826 1726867441.95382: in VariableManager get_vars() 23826 1726867441.95426: Calling all_inventory to load vars for managed_node2 23826 1726867441.95429: Calling groups_inventory to load vars for managed_node2 23826 1726867441.95431: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867441.95443: Calling all_plugins_play to load vars for managed_node2 23826 1726867441.95446: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867441.95564: Calling groups_plugins_play to load vars for managed_node2 23826 1726867441.97286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867441.98939: done with get_vars() 23826 1726867441.98959: done getting variables 23826 1726867441.99024: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:24:01 -0400 (0:00:00.070) 0:00:24.001 ****** 23826 1726867441.99060: entering _queue_task() for managed_node2/service 23826 1726867441.99596: worker is 1 (out of 1 available) 23826 1726867441.99606: exiting _queue_task() for managed_node2/service 23826 1726867441.99617: done queuing things up, now waiting for results queue to drain 23826 1726867441.99619: waiting for pending results... 
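Each TASK banner in this log carries a wall-clock timestamp and two durations, e.g. "(0:00:00.070) 0:00:24.001": the parenthesized figure is the time elapsed since the previous banner and the second figure is the cumulative elapsed time of the run. The sketch below extracts those figures into a simple listing; it reuses the same placeholder log file as above and is only an illustration, not part of the role or of ansible-core.

```python
import re

# Illustrative helper: list the timing figures printed in the TASK banners,
# e.g. "... 17:24:01 -0400 (0:00:00.070) 0:00:24.001 ***".
# LOG_PATH is a placeholder for a saved copy of this output.
LOG_PATH = "ansible-vvvv.log"

banner_re = re.compile(
    r'TASK \[([^\]]+)\].*?\((\d+:\d{2}:\d{2}\.\d+)\)\s+(\d+:\d{2}:\d{2}\.\d+)',
    re.DOTALL,
)

with open(LOG_PATH, encoding="utf-8") as fh:
    text = fh.read()

for name, step, total in banner_re.findall(text):
    print(f"{step}  (total {total})  {name}")
```
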
23826 1726867441.99748: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 23826 1726867441.99818: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000072 23826 1726867441.99844: variable 'ansible_search_path' from source: unknown 23826 1726867441.99856: variable 'ansible_search_path' from source: unknown 23826 1726867441.99956: calling self._execute() 23826 1726867442.00010: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867442.00026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867442.00040: variable 'omit' from source: magic vars 23826 1726867442.00440: variable 'ansible_distribution_major_version' from source: facts 23826 1726867442.00458: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867442.00582: variable 'connection_failed' from source: set_fact 23826 1726867442.00599: Evaluated conditional (not connection_failed): True 23826 1726867442.00725: variable 'ansible_distribution_major_version' from source: facts 23826 1726867442.00735: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867442.00840: variable 'connection_failed' from source: set_fact 23826 1726867442.00850: Evaluated conditional (not connection_failed): True 23826 1726867442.00983: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867442.01198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867442.03584: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867442.03632: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867442.03674: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867442.03721: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867442.03761: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867442.03854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867442.03890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867442.03951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867442.03978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867442.03998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867442.04050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867442.04090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867442.04123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867442.04170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867442.04195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867442.04278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867442.04282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867442.04304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867442.04352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867442.04371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867442.04567: variable 'network_connections' from source: play vars 23826 1726867442.04589: variable 'profile' from source: play vars 23826 1726867442.04712: variable 'profile' from source: play vars 23826 1726867442.04716: variable 'interface' from source: set_fact 23826 1726867442.04755: variable 'interface' from source: set_fact 23826 1726867442.04838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867442.05028: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867442.05079: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867442.05144: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867442.05147: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867442.05192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867442.05222: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 
1726867442.05256: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867442.05292: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867442.05362: variable '__network_team_connections_defined' from source: role '' defaults 23826 1726867442.05624: variable 'network_connections' from source: play vars 23826 1726867442.05634: variable 'profile' from source: play vars 23826 1726867442.05716: variable 'profile' from source: play vars 23826 1726867442.05719: variable 'interface' from source: set_fact 23826 1726867442.05825: variable 'interface' from source: set_fact 23826 1726867442.05832: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 23826 1726867442.05838: when evaluation is False, skipping this task 23826 1726867442.05845: _execute() done 23826 1726867442.05850: dumping result to json 23826 1726867442.05881: done dumping result, returning 23826 1726867442.05885: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-a92d-a3ea-000000000072] 23826 1726867442.05887: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000072 23826 1726867442.06136: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000072 23826 1726867442.06139: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 23826 1726867442.06195: no more pending results, returning what we have 23826 1726867442.06199: results queue empty 23826 1726867442.06200: checking for any_errors_fatal 23826 1726867442.06210: done checking for any_errors_fatal 23826 1726867442.06211: checking for max_fail_percentage 23826 1726867442.06213: done checking for max_fail_percentage 23826 1726867442.06214: checking to see if all hosts have failed and the running result is not ok 23826 1726867442.06215: done checking to see if all hosts have failed 23826 1726867442.06215: getting the remaining hosts for this loop 23826 1726867442.06217: done getting the remaining hosts for this loop 23826 1726867442.06220: getting the next task for host managed_node2 23826 1726867442.06226: done getting next task for host managed_node2 23826 1726867442.06230: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 23826 1726867442.06232: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867442.06245: getting variables 23826 1726867442.06252: in VariableManager get_vars() 23826 1726867442.06295: Calling all_inventory to load vars for managed_node2 23826 1726867442.06298: Calling groups_inventory to load vars for managed_node2 23826 1726867442.06300: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867442.06314: Calling all_plugins_play to load vars for managed_node2 23826 1726867442.06317: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867442.06320: Calling groups_plugins_play to load vars for managed_node2 23826 1726867442.07889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867442.09601: done with get_vars() 23826 1726867442.09629: done getting variables 23826 1726867442.09694: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:24:02 -0400 (0:00:00.106) 0:00:24.108 ****** 23826 1726867442.09733: entering _queue_task() for managed_node2/service 23826 1726867442.10095: worker is 1 (out of 1 available) 23826 1726867442.10109: exiting _queue_task() for managed_node2/service 23826 1726867442.10121: done queuing things up, now waiting for results queue to drain 23826 1726867442.10122: waiting for pending results... 23826 1726867442.10485: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 23826 1726867442.10551: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000073 23826 1726867442.10582: variable 'ansible_search_path' from source: unknown 23826 1726867442.10592: variable 'ansible_search_path' from source: unknown 23826 1726867442.10635: calling self._execute() 23826 1726867442.10783: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867442.10794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867442.10797: variable 'omit' from source: magic vars 23826 1726867442.11149: variable 'ansible_distribution_major_version' from source: facts 23826 1726867442.11165: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867442.11286: variable 'connection_failed' from source: set_fact 23826 1726867442.11296: Evaluated conditional (not connection_failed): True 23826 1726867442.11420: variable 'ansible_distribution_major_version' from source: facts 23826 1726867442.11431: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867442.11555: variable 'connection_failed' from source: set_fact 23826 1726867442.11558: Evaluated conditional (not connection_failed): True 23826 1726867442.11710: variable 'network_provider' from source: set_fact 23826 1726867442.11773: variable 'network_state' from source: role '' defaults 23826 1726867442.11778: Evaluated conditional (network_provider == "nm" or network_state != {}): True 23826 1726867442.11781: variable 'omit' from source: magic vars 23826 1726867442.11792: variable 'omit' from source: magic vars 23826 1726867442.11828: variable 'network_service_name' from source: 
role '' defaults 23826 1726867442.11911: variable 'network_service_name' from source: role '' defaults 23826 1726867442.12025: variable '__network_provider_setup' from source: role '' defaults 23826 1726867442.12035: variable '__network_service_name_default_nm' from source: role '' defaults 23826 1726867442.12104: variable '__network_service_name_default_nm' from source: role '' defaults 23826 1726867442.12122: variable '__network_packages_default_nm' from source: role '' defaults 23826 1726867442.12187: variable '__network_packages_default_nm' from source: role '' defaults 23826 1726867442.12482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867442.14954: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867442.15041: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867442.15082: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867442.15123: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867442.15159: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867442.15245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867442.15282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867442.15317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867442.15459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867442.15462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867442.15464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867442.15466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867442.15493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867442.15538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867442.15556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867442.15791: variable '__network_packages_default_gobject_packages' from source: role '' defaults 23826 1726867442.15922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867442.15951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867442.15980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867442.16030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867442.16048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867442.16148: variable 'ansible_python' from source: facts 23826 1726867442.16173: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 23826 1726867442.16268: variable '__network_wpa_supplicant_required' from source: role '' defaults 23826 1726867442.16383: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 23826 1726867442.16485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867442.16516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867442.16544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867442.16596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867442.16671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867442.16682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867442.16711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867442.16740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 
1726867442.16882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867442.16887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867442.16953: variable 'network_connections' from source: play vars 23826 1726867442.16965: variable 'profile' from source: play vars 23826 1726867442.17052: variable 'profile' from source: play vars 23826 1726867442.17063: variable 'interface' from source: set_fact 23826 1726867442.17135: variable 'interface' from source: set_fact 23826 1726867442.17250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867442.17467: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867442.17524: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867442.17585: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867442.17631: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867442.17705: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867442.17743: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867442.17875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867442.17882: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867442.17885: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867442.18191: variable 'network_connections' from source: play vars 23826 1726867442.18203: variable 'profile' from source: play vars 23826 1726867442.18288: variable 'profile' from source: play vars 23826 1726867442.18299: variable 'interface' from source: set_fact 23826 1726867442.18367: variable 'interface' from source: set_fact 23826 1726867442.18410: variable '__network_packages_default_wireless' from source: role '' defaults 23826 1726867442.18501: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867442.18872: variable 'network_connections' from source: play vars 23826 1726867442.18875: variable 'profile' from source: play vars 23826 1726867442.18910: variable 'profile' from source: play vars 23826 1726867442.18921: variable 'interface' from source: set_fact 23826 1726867442.19005: variable 'interface' from source: set_fact 23826 1726867442.19038: variable '__network_packages_default_team' from source: role '' defaults 23826 1726867442.19131: variable '__network_team_connections_defined' from source: role '' defaults 23826 1726867442.19451: variable 'network_connections' from source: play vars 23826 
1726867442.19584: variable 'profile' from source: play vars 23826 1726867442.19587: variable 'profile' from source: play vars 23826 1726867442.19589: variable 'interface' from source: set_fact 23826 1726867442.19628: variable 'interface' from source: set_fact 23826 1726867442.19690: variable '__network_service_name_default_initscripts' from source: role '' defaults 23826 1726867442.19762: variable '__network_service_name_default_initscripts' from source: role '' defaults 23826 1726867442.19774: variable '__network_packages_default_initscripts' from source: role '' defaults 23826 1726867442.19843: variable '__network_packages_default_initscripts' from source: role '' defaults 23826 1726867442.20216: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 23826 1726867442.20731: variable 'network_connections' from source: play vars 23826 1726867442.20742: variable 'profile' from source: play vars 23826 1726867442.20816: variable 'profile' from source: play vars 23826 1726867442.20825: variable 'interface' from source: set_fact 23826 1726867442.20910: variable 'interface' from source: set_fact 23826 1726867442.20984: variable 'ansible_distribution' from source: facts 23826 1726867442.20987: variable '__network_rh_distros' from source: role '' defaults 23826 1726867442.20989: variable 'ansible_distribution_major_version' from source: facts 23826 1726867442.20991: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 23826 1726867442.21142: variable 'ansible_distribution' from source: facts 23826 1726867442.21151: variable '__network_rh_distros' from source: role '' defaults 23826 1726867442.21161: variable 'ansible_distribution_major_version' from source: facts 23826 1726867442.21180: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 23826 1726867442.21391: variable 'ansible_distribution' from source: facts 23826 1726867442.21399: variable '__network_rh_distros' from source: role '' defaults 23826 1726867442.21411: variable 'ansible_distribution_major_version' from source: facts 23826 1726867442.21456: variable 'network_provider' from source: set_fact 23826 1726867442.21556: variable 'omit' from source: magic vars 23826 1726867442.21559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867442.21562: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867442.21572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867442.21597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867442.21615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867442.21646: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867442.21655: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867442.21671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867442.21781: Set connection var ansible_timeout to 10 23826 1726867442.21795: Set connection var ansible_shell_executable to /bin/sh 23826 1726867442.21802: Set connection var ansible_connection to ssh 23826 1726867442.21817: Set connection var ansible_pipelining to False 23826 1726867442.21823: Set 
connection var ansible_shell_type to sh 23826 1726867442.21831: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867442.21865: variable 'ansible_shell_executable' from source: unknown 23826 1726867442.21883: variable 'ansible_connection' from source: unknown 23826 1726867442.21886: variable 'ansible_module_compression' from source: unknown 23826 1726867442.21982: variable 'ansible_shell_type' from source: unknown 23826 1726867442.21987: variable 'ansible_shell_executable' from source: unknown 23826 1726867442.21989: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867442.21991: variable 'ansible_pipelining' from source: unknown 23826 1726867442.21994: variable 'ansible_timeout' from source: unknown 23826 1726867442.21996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867442.22116: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867442.22119: variable 'omit' from source: magic vars 23826 1726867442.22122: starting attempt loop 23826 1726867442.22124: running the handler 23826 1726867442.22154: variable 'ansible_facts' from source: unknown 23826 1726867442.22926: _low_level_execute_command(): starting 23826 1726867442.22937: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867442.23676: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867442.23697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867442.23756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867442.23825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867442.23842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867442.24014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867442.24074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867442.25753: stdout chunk (state=3): >>>/root <<< 23826 1726867442.25906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867442.25914: stdout chunk (state=3): >>><<< 23826 1726867442.25917: stderr chunk (state=3): >>><<< 23826 1726867442.26038: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867442.26042: _low_level_execute_command(): starting 23826 1726867442.26045: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867442.2593856-25066-120961862524987 `" && echo ansible-tmp-1726867442.2593856-25066-120961862524987="` echo /root/.ansible/tmp/ansible-tmp-1726867442.2593856-25066-120961862524987 `" ) && sleep 0' 23826 1726867442.27185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867442.27189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867442.27191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867442.27193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867442.27195: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867442.27197: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867442.27199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867442.27201: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867442.27203: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 23826 1726867442.27205: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 23826 1726867442.27207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867442.27212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867442.27214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867442.27354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867442.27494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867442.27644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867442.29600: stdout chunk 
(state=3): >>>ansible-tmp-1726867442.2593856-25066-120961862524987=/root/.ansible/tmp/ansible-tmp-1726867442.2593856-25066-120961862524987 <<< 23826 1726867442.29709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867442.29753: stderr chunk (state=3): >>><<< 23826 1726867442.29792: stdout chunk (state=3): >>><<< 23826 1726867442.29816: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867442.2593856-25066-120961862524987=/root/.ansible/tmp/ansible-tmp-1726867442.2593856-25066-120961862524987 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867442.29848: variable 'ansible_module_compression' from source: unknown 23826 1726867442.29899: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 23826 1726867442.29960: variable 'ansible_facts' from source: unknown 23826 1726867442.30571: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867442.2593856-25066-120961862524987/AnsiballZ_systemd.py 23826 1726867442.30907: Sending initial data 23826 1726867442.30910: Sent initial data (156 bytes) 23826 1726867442.31821: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867442.32086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867442.32093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867442.32164: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867442.33795: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 23826 1726867442.33799: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867442.33841: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 23826 1726867442.33964: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmp_m6t_f3i /root/.ansible/tmp/ansible-tmp-1726867442.2593856-25066-120961862524987/AnsiballZ_systemd.py <<< 23826 1726867442.33968: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867442.2593856-25066-120961862524987/AnsiballZ_systemd.py" <<< 23826 1726867442.34004: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmp_m6t_f3i" to remote "/root/.ansible/tmp/ansible-tmp-1726867442.2593856-25066-120961862524987/AnsiballZ_systemd.py" <<< 23826 1726867442.34008: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867442.2593856-25066-120961862524987/AnsiballZ_systemd.py" <<< 23826 1726867442.36639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867442.36703: stderr chunk (state=3): >>><<< 23826 1726867442.36706: stdout chunk (state=3): >>><<< 23826 1726867442.36774: done transferring module to remote 23826 1726867442.36780: _low_level_execute_command(): starting 23826 1726867442.36782: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867442.2593856-25066-120961862524987/ /root/.ansible/tmp/ansible-tmp-1726867442.2593856-25066-120961862524987/AnsiballZ_systemd.py && sleep 0' 23826 1726867442.38116: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867442.38120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867442.38123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867442.38126: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 23826 1726867442.38128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867442.38226: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867442.38249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867442.38264: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867442.38390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867442.40290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867442.40333: stderr chunk (state=3): >>><<< 23826 1726867442.40337: stdout chunk (state=3): >>><<< 23826 1726867442.40353: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867442.40356: _low_level_execute_command(): starting 23826 1726867442.40370: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867442.2593856-25066-120961862524987/AnsiballZ_systemd.py && sleep 0' 23826 1726867442.42128: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867442.42131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867442.42294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867442.42302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867442.42305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 23826 1726867442.42310: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867442.42381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867442.42387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867442.42745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867442.42837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867442.72786: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4546560", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3305766912", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1089603000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", 
"CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", 
"TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ConditionTimestampMonotonic": "284275676", "AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": 
"4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 23826 1726867442.75312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 23826 1726867442.75316: stdout chunk (state=3): >>><<< 23826 1726867442.75318: stderr chunk (state=3): >>><<< 23826 1726867442.75321: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4546560", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3305766912", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1089603000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", 
"Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", 
"CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ConditionTimestampMonotonic": "284275676", "AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", 
"StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 23826 1726867442.75609: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867442.2593856-25066-120961862524987/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867442.75704: _low_level_execute_command(): starting 23826 1726867442.75926: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867442.2593856-25066-120961862524987/ > /dev/null 2>&1 && sleep 0' 23826 1726867442.77009: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867442.77013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867442.77015: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867442.77017: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 23826 1726867442.77019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 23826 1726867442.77021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867442.77394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867442.79165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867442.79202: stderr chunk (state=3): >>><<< 23826 1726867442.79204: stdout chunk (state=3): >>><<< 23826 1726867442.79221: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867442.79335: handler run complete 23826 1726867442.79337: attempt loop complete, returning result 23826 1726867442.79339: _execute() done 23826 1726867442.79341: dumping result to json 23826 1726867442.79343: done dumping result, returning 23826 1726867442.79345: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-a92d-a3ea-000000000073] 23826 1726867442.79347: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000073 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 23826 1726867442.79955: no more pending results, returning what we have 23826 1726867442.79960: results queue empty 23826 1726867442.79961: checking for any_errors_fatal 23826 1726867442.79969: done checking for any_errors_fatal 23826 1726867442.79970: checking for max_fail_percentage 23826 1726867442.79972: done checking for max_fail_percentage 23826 1726867442.79973: checking to see if all hosts have failed and the running result is not ok 23826 1726867442.79974: done checking to see if all hosts have failed 23826 1726867442.79974: getting the remaining hosts for this loop 23826 1726867442.79976: done getting the remaining hosts for this loop 23826 1726867442.79983: getting the next task for host managed_node2 23826 1726867442.79990: done 
getting next task for host managed_node2 23826 1726867442.79994: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 23826 1726867442.79997: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867442.80010: getting variables 23826 1726867442.80012: in VariableManager get_vars() 23826 1726867442.80049: Calling all_inventory to load vars for managed_node2 23826 1726867442.80053: Calling groups_inventory to load vars for managed_node2 23826 1726867442.80055: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867442.80067: Calling all_plugins_play to load vars for managed_node2 23826 1726867442.80070: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867442.80073: Calling groups_plugins_play to load vars for managed_node2 23826 1726867442.81784: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000073 23826 1726867442.81787: WORKER PROCESS EXITING 23826 1726867442.83682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867442.87735: done with get_vars() 23826 1726867442.87762: done getting variables 23826 1726867442.87832: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:24:02 -0400 (0:00:00.781) 0:00:24.890 ****** 23826 1726867442.87868: entering _queue_task() for managed_node2/service 23826 1726867442.88615: worker is 1 (out of 1 available) 23826 1726867442.88628: exiting _queue_task() for managed_node2/service 23826 1726867442.88640: done queuing things up, now waiting for results queue to drain 23826 1726867442.88641: waiting for pending results... 
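
The block above is the "Enable and start NetworkManager" task completing: the 'service' action plugin wraps ansible.legacy.systemd, the module is shipped to the target as AnsiballZ_systemd.py inside a per-task temp directory, executed with /usr/bin/python3.12, and the temp directory is then removed. The result is reported as "censored" because the task runs with no_log: true. Judging only from the module arguments visible in the trace (name=NetworkManager, state=started, enabled=true), the task is roughly equivalent to the sketch below; the real task in the role's tasks/main.yml derives the service name from role variables, so treat this as an approximation rather than the actual source:

# Approximate reconstruction, not the role's actual YAML: only the module
# arguments (name/state/enabled) and no_log are taken from the trace above.
# The generic service action is shown because the log loads the 'service'
# action plugin, which on this systemd host delegates to ansible.legacy.systemd.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true   # why the result above is reported as "censored"

The large status dictionary returned by the module mirrors the unit properties systemd itself reports (compare the output of systemctl show NetworkManager.service); with ActiveState=active and UnitFileState=enabled already in place, the task comes back with changed: false.
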
23826 1726867442.89382: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 23826 1726867442.89866: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000074 23826 1726867442.89882: variable 'ansible_search_path' from source: unknown 23826 1726867442.89886: variable 'ansible_search_path' from source: unknown 23826 1726867442.89922: calling self._execute() 23826 1726867442.90129: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867442.90133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867442.90146: variable 'omit' from source: magic vars 23826 1726867442.91443: variable 'ansible_distribution_major_version' from source: facts 23826 1726867442.91455: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867442.91846: variable 'connection_failed' from source: set_fact 23826 1726867442.91851: Evaluated conditional (not connection_failed): True 23826 1726867442.92072: variable 'ansible_distribution_major_version' from source: facts 23826 1726867442.92076: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867442.92485: variable 'connection_failed' from source: set_fact 23826 1726867442.92488: Evaluated conditional (not connection_failed): True 23826 1726867442.92992: variable 'network_provider' from source: set_fact 23826 1726867442.92996: Evaluated conditional (network_provider == "nm"): True 23826 1726867442.93206: variable '__network_wpa_supplicant_required' from source: role '' defaults 23826 1726867442.93512: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 23826 1726867442.94156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867443.00614: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867443.00636: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867443.00681: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867443.00864: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867443.00898: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867443.01187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867443.01511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867443.01627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867443.01631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867443.01634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867443.01746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867443.02080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867443.02083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867443.02085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867443.02087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867443.02218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867443.02410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867443.02432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867443.02469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867443.02486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867443.03149: variable 'network_connections' from source: play vars 23826 1726867443.03160: variable 'profile' from source: play vars 23826 1726867443.03343: variable 'profile' from source: play vars 23826 1726867443.03346: variable 'interface' from source: set_fact 23826 1726867443.03559: variable 'interface' from source: set_fact 23826 1726867443.03731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867443.04213: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867443.04246: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867443.04273: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867443.04515: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867443.04518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 
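
The filter and test plugin loading around this point is part of evaluating the gating conditions for the "Enable and start wpa_supplicant" task. As the following lines show, network_provider == "nm" evaluates True, but __network_wpa_supplicant_required evaluates False (no 802.1x or wireless connections are defined in network_connections), so the task is skipped. The sketch below is illustrative only: because the task is skipped, its module arguments never appear in the log, so the name/state/enabled values are assumptions and only the when: conditions are taken from the trace:

# Illustrative only: the when: expressions match the conditionals evaluated
# in the trace; the module arguments are assumed, since a skipped task logs none.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant      # assumed
    state: started            # assumed
    enabled: true             # assumed
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool

The skip result that follows ("false_condition": "__network_wpa_supplicant_required") is Ansible's standard way of reporting which when: condition evaluated False.
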
23826 1726867443.04520: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867443.04522: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867443.04637: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867443.04733: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867443.05654: variable 'network_connections' from source: play vars 23826 1726867443.05667: variable 'profile' from source: play vars 23826 1726867443.05801: variable 'profile' from source: play vars 23826 1726867443.05804: variable 'interface' from source: set_fact 23826 1726867443.06038: variable 'interface' from source: set_fact 23826 1726867443.06044: Evaluated conditional (__network_wpa_supplicant_required): False 23826 1726867443.06046: when evaluation is False, skipping this task 23826 1726867443.06048: _execute() done 23826 1726867443.06050: dumping result to json 23826 1726867443.06053: done dumping result, returning 23826 1726867443.06055: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-a92d-a3ea-000000000074] 23826 1726867443.06299: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000074 23826 1726867443.06366: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000074 23826 1726867443.06369: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 23826 1726867443.06428: no more pending results, returning what we have 23826 1726867443.06432: results queue empty 23826 1726867443.06433: checking for any_errors_fatal 23826 1726867443.06457: done checking for any_errors_fatal 23826 1726867443.06458: checking for max_fail_percentage 23826 1726867443.06460: done checking for max_fail_percentage 23826 1726867443.06461: checking to see if all hosts have failed and the running result is not ok 23826 1726867443.06462: done checking to see if all hosts have failed 23826 1726867443.06463: getting the remaining hosts for this loop 23826 1726867443.06464: done getting the remaining hosts for this loop 23826 1726867443.06468: getting the next task for host managed_node2 23826 1726867443.06475: done getting next task for host managed_node2 23826 1726867443.06481: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 23826 1726867443.06483: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867443.06498: getting variables 23826 1726867443.06500: in VariableManager get_vars() 23826 1726867443.06543: Calling all_inventory to load vars for managed_node2 23826 1726867443.06546: Calling groups_inventory to load vars for managed_node2 23826 1726867443.06549: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867443.06559: Calling all_plugins_play to load vars for managed_node2 23826 1726867443.06562: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867443.06565: Calling groups_plugins_play to load vars for managed_node2 23826 1726867443.10521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867443.14130: done with get_vars() 23826 1726867443.14157: done getting variables 23826 1726867443.14227: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:24:03 -0400 (0:00:00.263) 0:00:25.153 ****** 23826 1726867443.14258: entering _queue_task() for managed_node2/service 23826 1726867443.14808: worker is 1 (out of 1 available) 23826 1726867443.14818: exiting _queue_task() for managed_node2/service 23826 1726867443.14827: done queuing things up, now waiting for results queue to drain 23826 1726867443.14828: waiting for pending results... 23826 1726867443.14987: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 23826 1726867443.15082: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000075 23826 1726867443.15103: variable 'ansible_search_path' from source: unknown 23826 1726867443.15110: variable 'ansible_search_path' from source: unknown 23826 1726867443.15147: calling self._execute() 23826 1726867443.15387: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867443.15391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867443.15394: variable 'omit' from source: magic vars 23826 1726867443.15684: variable 'ansible_distribution_major_version' from source: facts 23826 1726867443.15707: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867443.15829: variable 'connection_failed' from source: set_fact 23826 1726867443.15839: Evaluated conditional (not connection_failed): True 23826 1726867443.15955: variable 'ansible_distribution_major_version' from source: facts 23826 1726867443.16040: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867443.16074: variable 'connection_failed' from source: set_fact 23826 1726867443.16087: Evaluated conditional (not connection_failed): True 23826 1726867443.16204: variable 'network_provider' from source: set_fact 23826 1726867443.16215: Evaluated conditional (network_provider == "initscripts"): False 23826 1726867443.16224: when evaluation is False, skipping this task 23826 1726867443.16231: _execute() done 23826 1726867443.16238: dumping result to json 23826 1726867443.16244: done dumping result, returning 23826 1726867443.16261: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-a92d-a3ea-000000000075] 23826 1726867443.16269: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000075 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 23826 1726867443.16400: no more pending results, returning what we have 23826 1726867443.16403: results queue empty 23826 1726867443.16404: checking for any_errors_fatal 23826 1726867443.16414: done checking for any_errors_fatal 23826 1726867443.16415: checking for max_fail_percentage 23826 1726867443.16416: done checking for max_fail_percentage 23826 1726867443.16417: checking to see if all hosts have failed and the running result is not ok 23826 1726867443.16418: done checking to see if all hosts have failed 23826 1726867443.16419: getting the remaining hosts for this loop 23826 1726867443.16421: done getting the remaining hosts for this loop 23826 1726867443.16424: getting the next task for host managed_node2 23826 1726867443.16430: done getting next task for host managed_node2 23826 1726867443.16433: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 23826 1726867443.16435: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867443.16450: getting variables 23826 1726867443.16451: in VariableManager get_vars() 23826 1726867443.16510: Calling all_inventory to load vars for managed_node2 23826 1726867443.16513: Calling groups_inventory to load vars for managed_node2 23826 1726867443.16516: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867443.16529: Calling all_plugins_play to load vars for managed_node2 23826 1726867443.16532: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867443.16535: Calling groups_plugins_play to load vars for managed_node2 23826 1726867443.17209: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000075 23826 1726867443.17216: WORKER PROCESS EXITING 23826 1726867443.19482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867443.22711: done with get_vars() 23826 1726867443.22752: done getting variables 23826 1726867443.22822: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:24:03 -0400 (0:00:00.085) 0:00:25.239 ****** 23826 1726867443.22863: entering _queue_task() for managed_node2/copy 23826 1726867443.23267: worker is 1 (out of 1 available) 23826 1726867443.23328: exiting _queue_task() for managed_node2/copy 23826 1726867443.23338: done queuing things up, now waiting for results queue to drain 23826 1726867443.23340: waiting for pending results... 
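The two log entries above show tasks being skipped by their `when:` conditions: "Enable and start wpa_supplicant" reports `false_condition: __network_wpa_supplicant_required`, while "Enable network service" is also skipped but its payload is censored because `no_log: true` was set for that result. The following is a minimal Python sketch of how those two result shapes differ; it is an illustration only, not Ansible's actual TaskExecutor code, and `build_skip_result()` is a hypothetical helper.

```python
# Minimal sketch of the skipped-task result shapes printed above.
# Illustration only; not Ansible's TaskExecutor implementation.
import json


def build_skip_result(false_condition: str, no_log: bool = False) -> dict:
    """Return a result dict shaped like the skip results in this log."""
    if no_log:
        # With 'no_log: true' the real condition text is hidden in the output.
        return {
            "censored": "the output has been hidden due to the fact that "
                        "'no_log: true' was specified for this result",
            "changed": False,
        }
    return {
        "changed": False,
        "false_condition": false_condition,
        "skip_reason": "Conditional result was False",
    }


# Reproduces the two skip payloads shown in the log entries above.
print(json.dumps(build_skip_result("__network_wpa_supplicant_required"), indent=4))
print(json.dumps(build_skip_result("network_provider == 'initscripts'", no_log=True), indent=4))
```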
23826 1726867443.23813: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 23826 1726867443.24152: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000076 23826 1726867443.24215: variable 'ansible_search_path' from source: unknown 23826 1726867443.24219: variable 'ansible_search_path' from source: unknown 23826 1726867443.24396: calling self._execute() 23826 1726867443.24725: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867443.24730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867443.24733: variable 'omit' from source: magic vars 23826 1726867443.26147: variable 'ansible_distribution_major_version' from source: facts 23826 1726867443.26150: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867443.26436: variable 'connection_failed' from source: set_fact 23826 1726867443.26488: Evaluated conditional (not connection_failed): True 23826 1726867443.26794: variable 'ansible_distribution_major_version' from source: facts 23826 1726867443.26797: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867443.27127: variable 'connection_failed' from source: set_fact 23826 1726867443.27130: Evaluated conditional (not connection_failed): True 23826 1726867443.27543: variable 'network_provider' from source: set_fact 23826 1726867443.27546: Evaluated conditional (network_provider == "initscripts"): False 23826 1726867443.27552: when evaluation is False, skipping this task 23826 1726867443.27670: _execute() done 23826 1726867443.27673: dumping result to json 23826 1726867443.27676: done dumping result, returning 23826 1726867443.27694: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-a92d-a3ea-000000000076] 23826 1726867443.27702: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000076 23826 1726867443.27938: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000076 23826 1726867443.27941: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 23826 1726867443.27991: no more pending results, returning what we have 23826 1726867443.27995: results queue empty 23826 1726867443.27996: checking for any_errors_fatal 23826 1726867443.28002: done checking for any_errors_fatal 23826 1726867443.28003: checking for max_fail_percentage 23826 1726867443.28005: done checking for max_fail_percentage 23826 1726867443.28006: checking to see if all hosts have failed and the running result is not ok 23826 1726867443.28009: done checking to see if all hosts have failed 23826 1726867443.28009: getting the remaining hosts for this loop 23826 1726867443.28011: done getting the remaining hosts for this loop 23826 1726867443.28015: getting the next task for host managed_node2 23826 1726867443.28020: done getting next task for host managed_node2 23826 1726867443.28024: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 23826 1726867443.28026: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 23826 1726867443.28041: getting variables 23826 1726867443.28042: in VariableManager get_vars() 23826 1726867443.28200: Calling all_inventory to load vars for managed_node2 23826 1726867443.28203: Calling groups_inventory to load vars for managed_node2 23826 1726867443.28205: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867443.28220: Calling all_plugins_play to load vars for managed_node2 23826 1726867443.28223: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867443.28227: Calling groups_plugins_play to load vars for managed_node2 23826 1726867443.30297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867443.32909: done with get_vars() 23826 1726867443.32938: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:24:03 -0400 (0:00:00.103) 0:00:25.343 ****** 23826 1726867443.33269: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 23826 1726867443.33748: worker is 1 (out of 1 available) 23826 1726867443.33760: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 23826 1726867443.33772: done queuing things up, now waiting for results queue to drain 23826 1726867443.33774: waiting for pending results... 23826 1726867443.34093: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 23826 1726867443.34098: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000077 23826 1726867443.34123: variable 'ansible_search_path' from source: unknown 23826 1726867443.34126: variable 'ansible_search_path' from source: unknown 23826 1726867443.34166: calling self._execute() 23826 1726867443.34316: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867443.34320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867443.34323: variable 'omit' from source: magic vars 23826 1726867443.34722: variable 'ansible_distribution_major_version' from source: facts 23826 1726867443.34731: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867443.34839: variable 'connection_failed' from source: set_fact 23826 1726867443.34843: Evaluated conditional (not connection_failed): True 23826 1726867443.34951: variable 'ansible_distribution_major_version' from source: facts 23826 1726867443.34955: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867443.35249: variable 'connection_failed' from source: set_fact 23826 1726867443.35253: Evaluated conditional (not connection_failed): True 23826 1726867443.35256: variable 'omit' from source: magic vars 23826 1726867443.35279: variable 'omit' from source: magic vars 23826 1726867443.35468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867443.38472: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867443.38664: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867443.38673: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
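The "Configure networking connection profiles" task above resolves `network_connections` from play vars through `profile` and `interface` (the latter from set_fact). Only the final resolved values are confirmed by the module invocation later in this log (a single profile named ethtest0 taken down via the nm provider); the exact templating indirection in the play is an assumption. A hedged Python sketch of those resolved variables:

```python
# Sketch of the resolved play variables feeding the network_connections call
# that appears later in this log. The indirection via 'profile' and 'interface'
# mirrors the variable sources reported above; the exact Jinja2 templating used
# in the play is an assumption, but the final values match the module_args
# shown in the module invocation below.
interface = "ethtest0"                          # from set_fact
profile = {"name": interface, "state": "down"}  # from play vars (assumed shape)
network_connections = [profile]                 # from play vars

# What the role ultimately passes to fedora.linux_system_roles.network_connections:
module_args = {
    "provider": "nm",
    "connections": network_connections,
    "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
    "ignore_errors": False,
    "force_state_change": False,
    "__debug_flags": "",
}
print(module_args)
```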
23826 1726867443.38681: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867443.38685: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867443.38773: variable 'network_provider' from source: set_fact 23826 1726867443.38942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867443.38972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867443.39008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867443.39053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867443.39082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867443.39246: variable 'omit' from source: magic vars 23826 1726867443.39297: variable 'omit' from source: magic vars 23826 1726867443.39409: variable 'network_connections' from source: play vars 23826 1726867443.39423: variable 'profile' from source: play vars 23826 1726867443.39505: variable 'profile' from source: play vars 23826 1726867443.39508: variable 'interface' from source: set_fact 23826 1726867443.39583: variable 'interface' from source: set_fact 23826 1726867443.39764: variable 'omit' from source: magic vars 23826 1726867443.39775: variable '__lsr_ansible_managed' from source: task vars 23826 1726867443.39837: variable '__lsr_ansible_managed' from source: task vars 23826 1726867443.40146: Loaded config def from plugin (lookup/template) 23826 1726867443.40150: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 23826 1726867443.40226: File lookup term: get_ansible_managed.j2 23826 1726867443.40229: variable 'ansible_search_path' from source: unknown 23826 1726867443.40232: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 23826 1726867443.40236: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 23826 1726867443.40238: variable 'ansible_search_path' from source: unknown 23826 1726867443.47213: variable 'ansible_managed' from source: unknown 23826 1726867443.47383: variable 'omit' from source: magic vars 23826 1726867443.47388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867443.47415: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867443.47430: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867443.47447: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867443.47464: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867443.47493: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867443.47497: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867443.47499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867443.47633: Set connection var ansible_timeout to 10 23826 1726867443.47637: Set connection var ansible_shell_executable to /bin/sh 23826 1726867443.47639: Set connection var ansible_connection to ssh 23826 1726867443.47641: Set connection var ansible_pipelining to False 23826 1726867443.47644: Set connection var ansible_shell_type to sh 23826 1726867443.47645: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867443.47671: variable 'ansible_shell_executable' from source: unknown 23826 1726867443.47673: variable 'ansible_connection' from source: unknown 23826 1726867443.47676: variable 'ansible_module_compression' from source: unknown 23826 1726867443.47680: variable 'ansible_shell_type' from source: unknown 23826 1726867443.47682: variable 'ansible_shell_executable' from source: unknown 23826 1726867443.47692: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867443.47694: variable 'ansible_pipelining' from source: unknown 23826 1726867443.47696: variable 'ansible_timeout' from source: unknown 23826 1726867443.47698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867443.47904: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 23826 1726867443.47909: variable 'omit' from source: magic vars 23826 1726867443.47912: starting attempt loop 23826 1726867443.47916: running the handler 23826 1726867443.47919: _low_level_execute_command(): starting 23826 1726867443.47921: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867443.48830: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867443.48834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867443.48854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867443.48862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867443.48960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867443.50762: stdout chunk (state=3): >>>/root <<< 23826 1726867443.50825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867443.50831: stdout chunk (state=3): >>><<< 23826 1726867443.50834: stderr chunk (state=3): >>><<< 23826 1726867443.50988: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867443.50993: _low_level_execute_command(): starting 23826 1726867443.50999: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867443.508429-25111-208730848045674 `" && echo ansible-tmp-1726867443.508429-25111-208730848045674="` echo /root/.ansible/tmp/ansible-tmp-1726867443.508429-25111-208730848045674 `" ) && sleep 0' 23826 1726867443.51480: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867443.51493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867443.51504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867443.51519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867443.51532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 <<< 23826 1726867443.51539: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867443.51549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867443.51616: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867443.51619: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 23826 1726867443.51621: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 23826 1726867443.51624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867443.51626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867443.51720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867443.51723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867443.51746: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867443.51794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867443.51912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867443.53841: stdout chunk (state=3): >>>ansible-tmp-1726867443.508429-25111-208730848045674=/root/.ansible/tmp/ansible-tmp-1726867443.508429-25111-208730848045674 <<< 23826 1726867443.54041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867443.54044: stdout chunk (state=3): >>><<< 23826 1726867443.54046: stderr chunk (state=3): >>><<< 23826 1726867443.54296: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867443.508429-25111-208730848045674=/root/.ansible/tmp/ansible-tmp-1726867443.508429-25111-208730848045674 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867443.54303: variable 'ansible_module_compression' from source: unknown 23826 1726867443.54305: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 23826 1726867443.54307: variable 'ansible_facts' from source: unknown 23826 1726867443.54526: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867443.508429-25111-208730848045674/AnsiballZ_network_connections.py 23826 1726867443.54899: Sending initial data 23826 1726867443.54905: Sent initial data (167 bytes) 23826 1726867443.55641: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867443.55644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867443.55646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867443.55757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867443.55761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867443.55804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867443.55893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867443.55999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867443.57605: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867443.57643: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867443.57703: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmp31yiliso /root/.ansible/tmp/ansible-tmp-1726867443.508429-25111-208730848045674/AnsiballZ_network_connections.py <<< 23826 1726867443.57709: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867443.508429-25111-208730848045674/AnsiballZ_network_connections.py" <<< 23826 1726867443.57751: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmp31yiliso" to remote "/root/.ansible/tmp/ansible-tmp-1726867443.508429-25111-208730848045674/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867443.508429-25111-208730848045674/AnsiballZ_network_connections.py" <<< 23826 1726867443.59082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867443.59093: stderr chunk (state=3): >>><<< 23826 1726867443.59109: stdout chunk (state=3): >>><<< 23826 1726867443.59143: done transferring module to remote 23826 1726867443.59160: _low_level_execute_command(): starting 23826 1726867443.59176: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867443.508429-25111-208730848045674/ /root/.ansible/tmp/ansible-tmp-1726867443.508429-25111-208730848045674/AnsiballZ_network_connections.py && sleep 0' 23826 1726867443.60025: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867443.60102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867443.60176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867443.60233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867443.60295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867443.62369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867443.62372: stdout chunk (state=3): >>><<< 23826 1726867443.62374: stderr chunk (state=3): >>><<< 23826 1726867443.62520: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867443.62524: _low_level_execute_command(): starting 23826 1726867443.62526: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867443.508429-25111-208730848045674/AnsiballZ_network_connections.py && sleep 0' 23826 1726867443.63464: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867443.63580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867443.63616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867443.63717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867443.63842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867443.90670: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 23826 1726867443.92564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 23826 1726867443.92568: stdout chunk (state=3): >>><<< 23826 1726867443.92570: stderr chunk (state=3): >>><<< 23826 1726867443.92711: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
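The remote AnsiballZ module run above reports its outcome as a single JSON document on stdout, which the controller parses before rendering the "ok: [managed_node2]" result that follows. The sketch below parses that payload (abridged to the fields used here, copied from the log) to show the data flow; it is an illustration, not Ansible's own parsing code.

```python
# Parse the module stdout captured above (abridged) the way the controller
# does before producing the "ok" task result. Illustration of the data flow
# only; not Ansible's internal result handling.
import json

module_stdout = (
    '{"changed": false, "warnings": [], "stderr": "\\n", '
    '"invocation": {"module_args": {"provider": "nm", '
    '"connections": [{"name": "ethtest0", "state": "down"}], '
    '"__header": "#\\n# Ansible managed\\n#\\n# system_role:network\\n", '
    '"ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}'
)

result = json.loads(module_stdout)
assert result["changed"] is False
assert result["invocation"]["module_args"]["connections"] == [
    {"name": "ethtest0", "state": "down"}
]
print(result["invocation"]["module_args"]["provider"])  # -> nm
```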
23826 1726867443.92715: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867443.508429-25111-208730848045674/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867443.92718: _low_level_execute_command(): starting 23826 1726867443.92720: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867443.508429-25111-208730848045674/ > /dev/null 2>&1 && sleep 0' 23826 1726867443.93720: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867443.93737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867443.93751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867443.93833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867443.93945: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867443.94023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867443.94198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867443.94230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867443.96063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867443.96127: stderr chunk (state=3): >>><<< 23826 1726867443.96130: stdout chunk (state=3): >>><<< 23826 1726867443.96185: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867443.96189: handler run complete 23826 1726867443.96202: attempt loop complete, returning result 23826 1726867443.96210: _execute() done 23826 1726867443.96217: dumping result to json 23826 1726867443.96242: done dumping result, returning 23826 1726867443.96293: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-a92d-a3ea-000000000077] 23826 1726867443.96297: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000077 ok: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: 23826 1726867443.96542: no more pending results, returning what we have 23826 1726867443.96546: results queue empty 23826 1726867443.96547: checking for any_errors_fatal 23826 1726867443.96554: done checking for any_errors_fatal 23826 1726867443.96555: checking for max_fail_percentage 23826 1726867443.96557: done checking for max_fail_percentage 23826 1726867443.96558: checking to see if all hosts have failed and the running result is not ok 23826 1726867443.96559: done checking to see if all hosts have failed 23826 1726867443.96560: getting the remaining hosts for this loop 23826 1726867443.96561: done getting the remaining hosts for this loop 23826 1726867443.96565: getting the next task for host managed_node2 23826 1726867443.96572: done getting next task for host managed_node2 23826 1726867443.96576: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 23826 1726867443.96793: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867443.96816: getting variables 23826 1726867443.96818: in VariableManager get_vars() 23826 1726867443.96858: Calling all_inventory to load vars for managed_node2 23826 1726867443.96861: Calling groups_inventory to load vars for managed_node2 23826 1726867443.96864: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867443.96874: Calling all_plugins_play to load vars for managed_node2 23826 1726867443.96925: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867443.96932: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000077 23826 1726867443.96935: WORKER PROCESS EXITING 23826 1726867443.96939: Calling groups_plugins_play to load vars for managed_node2 23826 1726867443.98997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867444.00589: done with get_vars() 23826 1726867444.00616: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:24:04 -0400 (0:00:00.674) 0:00:26.018 ****** 23826 1726867444.00676: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 23826 1726867444.00941: worker is 1 (out of 1 available) 23826 1726867444.00951: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 23826 1726867444.00963: done queuing things up, now waiting for results queue to drain 23826 1726867444.00964: waiting for pending results... 23826 1726867444.01147: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 23826 1726867444.01222: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000078 23826 1726867444.01234: variable 'ansible_search_path' from source: unknown 23826 1726867444.01237: variable 'ansible_search_path' from source: unknown 23826 1726867444.01267: calling self._execute() 23826 1726867444.01341: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867444.01346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867444.01356: variable 'omit' from source: magic vars 23826 1726867444.01635: variable 'ansible_distribution_major_version' from source: facts 23826 1726867444.01645: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867444.01722: variable 'connection_failed' from source: set_fact 23826 1726867444.01725: Evaluated conditional (not connection_failed): True 23826 1726867444.01805: variable 'ansible_distribution_major_version' from source: facts 23826 1726867444.01811: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867444.01876: variable 'connection_failed' from source: set_fact 23826 1726867444.01881: Evaluated conditional (not connection_failed): True 23826 1726867444.01960: variable 'network_state' from source: role '' defaults 23826 1726867444.01972: Evaluated conditional (network_state != {}): False 23826 1726867444.01975: when evaluation is False, skipping this task 23826 1726867444.01980: _execute() done 23826 1726867444.01982: dumping result to json 23826 1726867444.01984: done dumping result, returning 23826 1726867444.02001: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-a92d-a3ea-000000000078] 23826 1726867444.02004: sending task 
result for task 0affcac9-a3a5-a92d-a3ea-000000000078 23826 1726867444.02127: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000078 23826 1726867444.02131: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 23826 1726867444.02188: no more pending results, returning what we have 23826 1726867444.02192: results queue empty 23826 1726867444.02193: checking for any_errors_fatal 23826 1726867444.02210: done checking for any_errors_fatal 23826 1726867444.02211: checking for max_fail_percentage 23826 1726867444.02213: done checking for max_fail_percentage 23826 1726867444.02214: checking to see if all hosts have failed and the running result is not ok 23826 1726867444.02215: done checking to see if all hosts have failed 23826 1726867444.02218: getting the remaining hosts for this loop 23826 1726867444.02219: done getting the remaining hosts for this loop 23826 1726867444.02223: getting the next task for host managed_node2 23826 1726867444.02231: done getting next task for host managed_node2 23826 1726867444.02234: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 23826 1726867444.02237: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867444.02252: getting variables 23826 1726867444.02254: in VariableManager get_vars() 23826 1726867444.02509: Calling all_inventory to load vars for managed_node2 23826 1726867444.02512: Calling groups_inventory to load vars for managed_node2 23826 1726867444.02515: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867444.02523: Calling all_plugins_play to load vars for managed_node2 23826 1726867444.02525: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867444.02528: Calling groups_plugins_play to load vars for managed_node2 23826 1726867444.03655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867444.04542: done with get_vars() 23826 1726867444.04557: done getting variables 23826 1726867444.04598: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:24:04 -0400 (0:00:00.039) 0:00:26.057 ****** 23826 1726867444.04620: entering _queue_task() for managed_node2/debug 23826 1726867444.04821: worker is 1 (out of 1 available) 23826 1726867444.04834: exiting _queue_task() for managed_node2/debug 23826 1726867444.04845: done queuing things up, now waiting for results queue to drain 23826 1726867444.04846: waiting for pending results... 
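The "Configure networking state" task above is skipped because `network_state` comes from the role defaults and is empty, so `network_state != {}` evaluates False; the same two gating conditionals seen throughout this section (the distribution major version check and `not connection_failed`) evaluate True first. A small Python sketch reproducing that chain; the concrete fact values are assumptions chosen only to match the True/True/False pattern logged above.

```python
# Sketch of the conditional chain logged for "Configure networking state".
# Fact values are assumptions that reproduce the True/True/False pattern
# above; only the evaluation results themselves are confirmed by the log.
facts = {"ansible_distribution_major_version": "10"}  # assumed; log only shows "!= '6'" is True
connection_failed = False                             # set_fact, so "not connection_failed" is True
network_state = {}                                    # role default, hence the skip

checks = [
    ("ansible_distribution_major_version != '6'",
     facts["ansible_distribution_major_version"] != "6"),
    ("not connection_failed", not connection_failed),
    ("network_state != {}", network_state != {}),
]

for expr, value in checks:
    print(f"Evaluated conditional ({expr}): {value}")
# The first False ends evaluation: "when evaluation is False, skipping this task".
```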
23826 1726867444.05039: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 23826 1726867444.05284: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000079 23826 1726867444.05287: variable 'ansible_search_path' from source: unknown 23826 1726867444.05290: variable 'ansible_search_path' from source: unknown 23826 1726867444.05292: calling self._execute() 23826 1726867444.05302: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867444.05310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867444.05321: variable 'omit' from source: magic vars 23826 1726867444.05681: variable 'ansible_distribution_major_version' from source: facts 23826 1726867444.05694: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867444.05811: variable 'connection_failed' from source: set_fact 23826 1726867444.05815: Evaluated conditional (not connection_failed): True 23826 1726867444.05919: variable 'ansible_distribution_major_version' from source: facts 23826 1726867444.05922: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867444.06019: variable 'connection_failed' from source: set_fact 23826 1726867444.06022: Evaluated conditional (not connection_failed): True 23826 1726867444.06029: variable 'omit' from source: magic vars 23826 1726867444.06071: variable 'omit' from source: magic vars 23826 1726867444.06105: variable 'omit' from source: magic vars 23826 1726867444.06145: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867444.06180: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867444.06200: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867444.06217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867444.06230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867444.06258: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867444.06261: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867444.06264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867444.06403: Set connection var ansible_timeout to 10 23826 1726867444.06406: Set connection var ansible_shell_executable to /bin/sh 23826 1726867444.06412: Set connection var ansible_connection to ssh 23826 1726867444.06414: Set connection var ansible_pipelining to False 23826 1726867444.06416: Set connection var ansible_shell_type to sh 23826 1726867444.06419: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867444.06421: variable 'ansible_shell_executable' from source: unknown 23826 1726867444.06423: variable 'ansible_connection' from source: unknown 23826 1726867444.06425: variable 'ansible_module_compression' from source: unknown 23826 1726867444.06427: variable 'ansible_shell_type' from source: unknown 23826 1726867444.06429: variable 'ansible_shell_executable' from source: unknown 23826 1726867444.06431: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867444.06433: variable 'ansible_pipelining' from source: unknown 23826 1726867444.06435: 
variable 'ansible_timeout' from source: unknown 23826 1726867444.06437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867444.06631: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867444.06634: variable 'omit' from source: magic vars 23826 1726867444.06637: starting attempt loop 23826 1726867444.06639: running the handler 23826 1726867444.06729: variable '__network_connections_result' from source: set_fact 23826 1726867444.06780: handler run complete 23826 1726867444.06803: attempt loop complete, returning result 23826 1726867444.06809: _execute() done 23826 1726867444.06812: dumping result to json 23826 1726867444.06816: done dumping result, returning 23826 1726867444.06818: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-a92d-a3ea-000000000079] 23826 1726867444.06834: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000079 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 23826 1726867444.07006: no more pending results, returning what we have 23826 1726867444.07011: results queue empty 23826 1726867444.07012: checking for any_errors_fatal 23826 1726867444.07016: done checking for any_errors_fatal 23826 1726867444.07017: checking for max_fail_percentage 23826 1726867444.07018: done checking for max_fail_percentage 23826 1726867444.07019: checking to see if all hosts have failed and the running result is not ok 23826 1726867444.07020: done checking to see if all hosts have failed 23826 1726867444.07021: getting the remaining hosts for this loop 23826 1726867444.07023: done getting the remaining hosts for this loop 23826 1726867444.07025: getting the next task for host managed_node2 23826 1726867444.07030: done getting next task for host managed_node2 23826 1726867444.07034: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 23826 1726867444.07036: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867444.07043: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000079 23826 1726867444.07046: WORKER PROCESS EXITING 23826 1726867444.07052: getting variables 23826 1726867444.07053: in VariableManager get_vars() 23826 1726867444.07085: Calling all_inventory to load vars for managed_node2 23826 1726867444.07088: Calling groups_inventory to load vars for managed_node2 23826 1726867444.07098: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867444.07106: Calling all_plugins_play to load vars for managed_node2 23826 1726867444.07111: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867444.07114: Calling groups_plugins_play to load vars for managed_node2 23826 1726867444.07882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867444.09162: done with get_vars() 23826 1726867444.09184: done getting variables 23826 1726867444.09245: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:24:04 -0400 (0:00:00.046) 0:00:26.104 ****** 23826 1726867444.09274: entering _queue_task() for managed_node2/debug 23826 1726867444.09538: worker is 1 (out of 1 available) 23826 1726867444.09549: exiting _queue_task() for managed_node2/debug 23826 1726867444.09564: done queuing things up, now waiting for results queue to drain 23826 1726867444.09566: waiting for pending results... 
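The "Show stderr messages" debug task above prints `__network_connections_result.stderr_lines` as `[""]`. That follows directly from the module result: its `stderr` was a single newline, and splitting that into lines yields exactly one empty string. A tiny sketch of the transformation, mirroring Python's splitlines semantics rather than quoting Ansible's internals:

```python
# Why the debug output above shows stderr_lines == [""]: the module reported
# a stderr of a single newline, which splits into one empty line.
stderr = "\n"                       # from the network_connections module result
stderr_lines = stderr.splitlines()  # -> ['']
print(stderr_lines)                 # [''], matching the "ok:" output above
assert stderr_lines == [""]
```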
23826 1726867444.09830: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 23826 1726867444.09913: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000007a 23826 1726867444.09927: variable 'ansible_search_path' from source: unknown 23826 1726867444.09931: variable 'ansible_search_path' from source: unknown 23826 1726867444.09965: calling self._execute() 23826 1726867444.10046: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867444.10051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867444.10061: variable 'omit' from source: magic vars 23826 1726867444.10437: variable 'ansible_distribution_major_version' from source: facts 23826 1726867444.10440: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867444.10590: variable 'connection_failed' from source: set_fact 23826 1726867444.10593: Evaluated conditional (not connection_failed): True 23826 1726867444.10622: variable 'ansible_distribution_major_version' from source: facts 23826 1726867444.10628: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867444.10722: variable 'connection_failed' from source: set_fact 23826 1726867444.10725: Evaluated conditional (not connection_failed): True 23826 1726867444.10732: variable 'omit' from source: magic vars 23826 1726867444.10766: variable 'omit' from source: magic vars 23826 1726867444.10803: variable 'omit' from source: magic vars 23826 1726867444.10842: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867444.10875: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867444.10904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867444.10923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867444.10927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867444.10951: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867444.10955: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867444.10957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867444.11123: Set connection var ansible_timeout to 10 23826 1726867444.11126: Set connection var ansible_shell_executable to /bin/sh 23826 1726867444.11129: Set connection var ansible_connection to ssh 23826 1726867444.11131: Set connection var ansible_pipelining to False 23826 1726867444.11135: Set connection var ansible_shell_type to sh 23826 1726867444.11137: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867444.11139: variable 'ansible_shell_executable' from source: unknown 23826 1726867444.11141: variable 'ansible_connection' from source: unknown 23826 1726867444.11143: variable 'ansible_module_compression' from source: unknown 23826 1726867444.11145: variable 'ansible_shell_type' from source: unknown 23826 1726867444.11147: variable 'ansible_shell_executable' from source: unknown 23826 1726867444.11149: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867444.11151: variable 'ansible_pipelining' from source: unknown 23826 1726867444.11153: 
variable 'ansible_timeout' from source: unknown 23826 1726867444.11155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867444.11353: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867444.11357: variable 'omit' from source: magic vars 23826 1726867444.11359: starting attempt loop 23826 1726867444.11361: running the handler 23826 1726867444.11363: variable '__network_connections_result' from source: set_fact 23826 1726867444.11393: variable '__network_connections_result' from source: set_fact 23826 1726867444.11473: handler run complete 23826 1726867444.11511: attempt loop complete, returning result 23826 1726867444.11515: _execute() done 23826 1726867444.11517: dumping result to json 23826 1726867444.11519: done dumping result, returning 23826 1726867444.11522: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-a92d-a3ea-00000000007a] 23826 1726867444.11525: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000007a ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 23826 1726867444.11705: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000007a 23826 1726867444.11714: WORKER PROCESS EXITING 23826 1726867444.11724: no more pending results, returning what we have 23826 1726867444.11726: results queue empty 23826 1726867444.11727: checking for any_errors_fatal 23826 1726867444.11733: done checking for any_errors_fatal 23826 1726867444.11734: checking for max_fail_percentage 23826 1726867444.11735: done checking for max_fail_percentage 23826 1726867444.11736: checking to see if all hosts have failed and the running result is not ok 23826 1726867444.11737: done checking to see if all hosts have failed 23826 1726867444.11737: getting the remaining hosts for this loop 23826 1726867444.11739: done getting the remaining hosts for this loop 23826 1726867444.11741: getting the next task for host managed_node2 23826 1726867444.11745: done getting next task for host managed_node2 23826 1726867444.11751: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 23826 1726867444.11753: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867444.11764: getting variables 23826 1726867444.11766: in VariableManager get_vars() 23826 1726867444.11839: Calling all_inventory to load vars for managed_node2 23826 1726867444.11841: Calling groups_inventory to load vars for managed_node2 23826 1726867444.11844: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867444.11852: Calling all_plugins_play to load vars for managed_node2 23826 1726867444.11854: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867444.11856: Calling groups_plugins_play to load vars for managed_node2 23826 1726867444.12710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867444.14924: done with get_vars() 23826 1726867444.14949: done getting variables 23826 1726867444.15015: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:24:04 -0400 (0:00:00.057) 0:00:26.161 ****** 23826 1726867444.15049: entering _queue_task() for managed_node2/debug 23826 1726867444.15330: worker is 1 (out of 1 available) 23826 1726867444.15346: exiting _queue_task() for managed_node2/debug 23826 1726867444.15358: done queuing things up, now waiting for results queue to drain 23826 1726867444.15359: waiting for pending results... 23826 1726867444.15548: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 23826 1726867444.15697: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000007b 23826 1726867444.15702: variable 'ansible_search_path' from source: unknown 23826 1726867444.15704: variable 'ansible_search_path' from source: unknown 23826 1726867444.15751: calling self._execute() 23826 1726867444.15845: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867444.15851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867444.15864: variable 'omit' from source: magic vars 23826 1726867444.16322: variable 'ansible_distribution_major_version' from source: facts 23826 1726867444.16325: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867444.16448: variable 'connection_failed' from source: set_fact 23826 1726867444.16451: Evaluated conditional (not connection_failed): True 23826 1726867444.16536: variable 'ansible_distribution_major_version' from source: facts 23826 1726867444.16540: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867444.16609: variable 'connection_failed' from source: set_fact 23826 1726867444.16613: Evaluated conditional (not connection_failed): True 23826 1726867444.16689: variable 'network_state' from source: role '' defaults 23826 1726867444.16698: Evaluated conditional (network_state != {}): False 23826 1726867444.16701: when evaluation is False, skipping this task 23826 1726867444.16704: _execute() done 23826 1726867444.16706: dumping result to json 23826 1726867444.16716: done dumping result, returning 23826 1726867444.16722: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-a92d-a3ea-00000000007b] 23826 1726867444.16725: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000007b 23826 1726867444.16806: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000007b 23826 1726867444.16809: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 23826 1726867444.16855: no more pending results, returning what we have 23826 1726867444.16858: results queue empty 23826 1726867444.16859: checking for any_errors_fatal 23826 1726867444.16866: done checking for any_errors_fatal 23826 1726867444.16866: checking for max_fail_percentage 23826 1726867444.16868: done checking for max_fail_percentage 23826 1726867444.16869: checking to see if all hosts have failed and the running result is not ok 23826 1726867444.16873: done checking to see if all hosts have failed 23826 1726867444.16874: getting the remaining hosts for this loop 23826 1726867444.16875: done getting the remaining hosts for this loop 23826 1726867444.16880: getting the next task for host managed_node2 23826 1726867444.16886: done getting next task for host managed_node2 23826 1726867444.16890: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 23826 1726867444.16892: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867444.16903: getting variables 23826 1726867444.16905: in VariableManager get_vars() 23826 1726867444.16935: Calling all_inventory to load vars for managed_node2 23826 1726867444.16937: Calling groups_inventory to load vars for managed_node2 23826 1726867444.16939: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867444.16947: Calling all_plugins_play to load vars for managed_node2 23826 1726867444.16949: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867444.16952: Calling groups_plugins_play to load vars for managed_node2 23826 1726867444.22396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867444.23752: done with get_vars() 23826 1726867444.23779: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:24:04 -0400 (0:00:00.088) 0:00:26.249 ****** 23826 1726867444.23858: entering _queue_task() for managed_node2/ping 23826 1726867444.24530: worker is 1 (out of 1 available) 23826 1726867444.24542: exiting _queue_task() for managed_node2/ping 23826 1726867444.24553: done queuing things up, now waiting for results queue to drain 23826 1726867444.24555: waiting for pending results... 
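The `Show debug messages for the network_state` task above is skipped because its conditional evaluates to False against the role default. A minimal sketch of that decision, assuming `network_state` defaults to an empty dict (consistent with the logged `false_condition` of `network_state != {}`):

```python
# Illustration only: the skip decision recorded above.
# Assumption: the role default is network_state = {}.
network_state = {}

if network_state != {}:
    print("run: Show debug messages for the network_state")
else:
    print("skip: false_condition is 'network_state != {}'")  # matches the log
```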
23826 1726867444.25085: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 23826 1726867444.25092: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000007c 23826 1726867444.25095: variable 'ansible_search_path' from source: unknown 23826 1726867444.25098: variable 'ansible_search_path' from source: unknown 23826 1726867444.25100: calling self._execute() 23826 1726867444.25102: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867444.25105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867444.25384: variable 'omit' from source: magic vars 23826 1726867444.25654: variable 'ansible_distribution_major_version' from source: facts 23826 1726867444.25659: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867444.25662: variable 'connection_failed' from source: set_fact 23826 1726867444.25665: Evaluated conditional (not connection_failed): True 23826 1726867444.25769: variable 'ansible_distribution_major_version' from source: facts 23826 1726867444.25772: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867444.25840: variable 'connection_failed' from source: set_fact 23826 1726867444.25844: Evaluated conditional (not connection_failed): True 23826 1726867444.25850: variable 'omit' from source: magic vars 23826 1726867444.25893: variable 'omit' from source: magic vars 23826 1726867444.25932: variable 'omit' from source: magic vars 23826 1726867444.25969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867444.26010: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867444.26031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867444.26048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867444.26061: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867444.26090: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867444.26094: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867444.26096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867444.26284: Set connection var ansible_timeout to 10 23826 1726867444.26287: Set connection var ansible_shell_executable to /bin/sh 23826 1726867444.26291: Set connection var ansible_connection to ssh 23826 1726867444.26294: Set connection var ansible_pipelining to False 23826 1726867444.26297: Set connection var ansible_shell_type to sh 23826 1726867444.26299: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867444.26302: variable 'ansible_shell_executable' from source: unknown 23826 1726867444.26305: variable 'ansible_connection' from source: unknown 23826 1726867444.26307: variable 'ansible_module_compression' from source: unknown 23826 1726867444.26314: variable 'ansible_shell_type' from source: unknown 23826 1726867444.26318: variable 'ansible_shell_executable' from source: unknown 23826 1726867444.26321: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867444.26324: variable 'ansible_pipelining' from source: unknown 23826 1726867444.26326: variable 'ansible_timeout' from 
source: unknown 23826 1726867444.26329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867444.26554: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 23826 1726867444.26559: variable 'omit' from source: magic vars 23826 1726867444.26561: starting attempt loop 23826 1726867444.26563: running the handler 23826 1726867444.26565: _low_level_execute_command(): starting 23826 1726867444.26568: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867444.27462: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867444.27481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867444.27588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867444.27785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867444.27792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867444.29601: stdout chunk (state=3): >>>/root <<< 23826 1726867444.29604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867444.29607: stdout chunk (state=3): >>><<< 23826 1726867444.29617: stderr chunk (state=3): >>><<< 23826 1726867444.29688: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867444.29704: _low_level_execute_command(): starting 23826 1726867444.29713: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867444.2968895-25146-122616145904471 `" && echo ansible-tmp-1726867444.2968895-25146-122616145904471="` echo /root/.ansible/tmp/ansible-tmp-1726867444.2968895-25146-122616145904471 `" ) && sleep 0' 23826 1726867444.30463: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867444.30466: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867444.30476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867444.30482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867444.30485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867444.30487: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867444.30489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867444.30491: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867444.30500: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 23826 1726867444.30539: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867444.30592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867444.30683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867444.30867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867444.32682: stdout chunk (state=3): >>>ansible-tmp-1726867444.2968895-25146-122616145904471=/root/.ansible/tmp/ansible-tmp-1726867444.2968895-25146-122616145904471 <<< 23826 1726867444.32791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867444.32892: stderr chunk (state=3): >>><<< 23826 1726867444.32898: stdout chunk (state=3): >>><<< 23826 1726867444.32924: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867444.2968895-25146-122616145904471=/root/.ansible/tmp/ansible-tmp-1726867444.2968895-25146-122616145904471 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867444.32973: variable 'ansible_module_compression' from source: unknown 23826 1726867444.33015: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 23826 1726867444.33049: variable 'ansible_facts' from source: unknown 23826 1726867444.33310: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867444.2968895-25146-122616145904471/AnsiballZ_ping.py 23826 1726867444.33864: Sending initial data 23826 1726867444.33868: Sent initial data (153 bytes) 23826 1726867444.34260: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867444.34269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867444.34468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867444.34472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867444.34482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867444.34486: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867444.34491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867444.34494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867444.34497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867444.34500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867444.34635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867444.36247: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867444.36291: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 23826 1726867444.36400: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmp8sszcxwl /root/.ansible/tmp/ansible-tmp-1726867444.2968895-25146-122616145904471/AnsiballZ_ping.py <<< 23826 1726867444.36403: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867444.2968895-25146-122616145904471/AnsiballZ_ping.py" <<< 23826 1726867444.36432: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmp8sszcxwl" to remote "/root/.ansible/tmp/ansible-tmp-1726867444.2968895-25146-122616145904471/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867444.2968895-25146-122616145904471/AnsiballZ_ping.py" <<< 23826 1726867444.37753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867444.37756: stdout chunk (state=3): >>><<< 23826 1726867444.37780: stderr chunk (state=3): >>><<< 23826 1726867444.37829: done transferring module to remote 23826 1726867444.37832: _low_level_execute_command(): starting 23826 1726867444.37836: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867444.2968895-25146-122616145904471/ /root/.ansible/tmp/ansible-tmp-1726867444.2968895-25146-122616145904471/AnsiballZ_ping.py && sleep 0' 23826 1726867444.39082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867444.39090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867444.39094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867444.39096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867444.39109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867444.39112: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867444.39114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867444.39116: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867444.39192: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867444.39268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867444.39400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867444.39436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867444.41584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867444.41587: 
stdout chunk (state=3): >>><<< 23826 1726867444.41590: stderr chunk (state=3): >>><<< 23826 1726867444.41592: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867444.41594: _low_level_execute_command(): starting 23826 1726867444.41596: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867444.2968895-25146-122616145904471/AnsiballZ_ping.py && sleep 0' 23826 1726867444.42670: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867444.42725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867444.42980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867444.42995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867444.58616: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 23826 1726867444.59907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 23826 1726867444.59941: stderr chunk (state=3): >>><<< 23826 1726867444.59949: stdout chunk (state=3): >>><<< 23826 1726867444.60011: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 23826 1726867444.60133: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867444.2968895-25146-122616145904471/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867444.60137: _low_level_execute_command(): starting 23826 1726867444.60139: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867444.2968895-25146-122616145904471/ > /dev/null 2>&1 && sleep 0' 23826 1726867444.61285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867444.61459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: 
fd 3 setting O_NONBLOCK <<< 23826 1726867444.61696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867444.61755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867444.63667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867444.63676: stdout chunk (state=3): >>><<< 23826 1726867444.64083: stderr chunk (state=3): >>><<< 23826 1726867444.64087: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867444.64089: handler run complete 23826 1726867444.64092: attempt loop complete, returning result 23826 1726867444.64093: _execute() done 23826 1726867444.64096: dumping result to json 23826 1726867444.64097: done dumping result, returning 23826 1726867444.64099: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-a92d-a3ea-00000000007c] 23826 1726867444.64101: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000007c 23826 1726867444.64171: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000007c 23826 1726867444.64175: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 23826 1726867444.64238: no more pending results, returning what we have 23826 1726867444.64242: results queue empty 23826 1726867444.64243: checking for any_errors_fatal 23826 1726867444.64255: done checking for any_errors_fatal 23826 1726867444.64255: checking for max_fail_percentage 23826 1726867444.64258: done checking for max_fail_percentage 23826 1726867444.64259: checking to see if all hosts have failed and the running result is not ok 23826 1726867444.64260: done checking to see if all hosts have failed 23826 1726867444.64261: getting the remaining hosts for this loop 23826 1726867444.64262: done getting the remaining hosts for this loop 23826 1726867444.64266: getting the next task for host managed_node2 23826 1726867444.64273: done getting next task for host managed_node2 23826 1726867444.64480: ^ task is: TASK: meta (role_complete) 23826 1726867444.64483: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 23826 1726867444.64494: getting variables 23826 1726867444.64496: in VariableManager get_vars() 23826 1726867444.64530: Calling all_inventory to load vars for managed_node2 23826 1726867444.64532: Calling groups_inventory to load vars for managed_node2 23826 1726867444.64534: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867444.64543: Calling all_plugins_play to load vars for managed_node2 23826 1726867444.64545: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867444.64548: Calling groups_plugins_play to load vars for managed_node2 23826 1726867444.67140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867444.70810: done with get_vars() 23826 1726867444.70950: done getting variables 23826 1726867444.71034: done queuing things up, now waiting for results queue to drain 23826 1726867444.71135: results queue empty 23826 1726867444.71137: checking for any_errors_fatal 23826 1726867444.71140: done checking for any_errors_fatal 23826 1726867444.71141: checking for max_fail_percentage 23826 1726867444.71142: done checking for max_fail_percentage 23826 1726867444.71143: checking to see if all hosts have failed and the running result is not ok 23826 1726867444.71144: done checking to see if all hosts have failed 23826 1726867444.71149: getting the remaining hosts for this loop 23826 1726867444.71150: done getting the remaining hosts for this loop 23826 1726867444.71153: getting the next task for host managed_node2 23826 1726867444.71157: done getting next task for host managed_node2 23826 1726867444.71159: ^ task is: TASK: meta (flush_handlers) 23826 1726867444.71160: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867444.71163: getting variables 23826 1726867444.71164: in VariableManager get_vars() 23826 1726867444.71176: Calling all_inventory to load vars for managed_node2 23826 1726867444.71181: Calling groups_inventory to load vars for managed_node2 23826 1726867444.71183: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867444.71188: Calling all_plugins_play to load vars for managed_node2 23826 1726867444.71191: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867444.71193: Calling groups_plugins_play to load vars for managed_node2 23826 1726867444.73226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867444.75057: done with get_vars() 23826 1726867444.75125: done getting variables 23826 1726867444.75191: in VariableManager get_vars() 23826 1726867444.75203: Calling all_inventory to load vars for managed_node2 23826 1726867444.75205: Calling groups_inventory to load vars for managed_node2 23826 1726867444.75207: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867444.75212: Calling all_plugins_play to load vars for managed_node2 23826 1726867444.75214: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867444.75217: Calling groups_plugins_play to load vars for managed_node2 23826 1726867444.77246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867444.79160: done with get_vars() 23826 1726867444.79195: done queuing things up, now waiting for results queue to drain 23826 1726867444.79197: results queue empty 23826 1726867444.79198: checking for any_errors_fatal 23826 1726867444.79200: done checking for any_errors_fatal 23826 1726867444.79200: checking for max_fail_percentage 23826 1726867444.79202: done checking for max_fail_percentage 23826 1726867444.79202: checking to see if all hosts have failed and the running result is not ok 23826 1726867444.79203: done checking to see if all hosts have failed 23826 1726867444.79204: getting the remaining hosts for this loop 23826 1726867444.79205: done getting the remaining hosts for this loop 23826 1726867444.79210: getting the next task for host managed_node2 23826 1726867444.79214: done getting next task for host managed_node2 23826 1726867444.79216: ^ task is: TASK: meta (flush_handlers) 23826 1726867444.79217: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867444.79224: getting variables 23826 1726867444.79225: in VariableManager get_vars() 23826 1726867444.79238: Calling all_inventory to load vars for managed_node2 23826 1726867444.79240: Calling groups_inventory to load vars for managed_node2 23826 1726867444.79242: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867444.79247: Calling all_plugins_play to load vars for managed_node2 23826 1726867444.79250: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867444.79253: Calling groups_plugins_play to load vars for managed_node2 23826 1726867444.80555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867444.82232: done with get_vars() 23826 1726867444.82252: done getting variables 23826 1726867444.82305: in VariableManager get_vars() 23826 1726867444.82325: Calling all_inventory to load vars for managed_node2 23826 1726867444.82327: Calling groups_inventory to load vars for managed_node2 23826 1726867444.82330: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867444.82335: Calling all_plugins_play to load vars for managed_node2 23826 1726867444.82337: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867444.82340: Calling groups_plugins_play to load vars for managed_node2 23826 1726867444.83614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867444.85404: done with get_vars() 23826 1726867444.85432: done queuing things up, now waiting for results queue to drain 23826 1726867444.85434: results queue empty 23826 1726867444.85435: checking for any_errors_fatal 23826 1726867444.85436: done checking for any_errors_fatal 23826 1726867444.85437: checking for max_fail_percentage 23826 1726867444.85438: done checking for max_fail_percentage 23826 1726867444.85438: checking to see if all hosts have failed and the running result is not ok 23826 1726867444.85439: done checking to see if all hosts have failed 23826 1726867444.85440: getting the remaining hosts for this loop 23826 1726867444.85441: done getting the remaining hosts for this loop 23826 1726867444.85444: getting the next task for host managed_node2 23826 1726867444.85447: done getting next task for host managed_node2 23826 1726867444.85448: ^ task is: None 23826 1726867444.85449: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867444.85450: done queuing things up, now waiting for results queue to drain 23826 1726867444.85451: results queue empty 23826 1726867444.85452: checking for any_errors_fatal 23826 1726867444.85453: done checking for any_errors_fatal 23826 1726867444.85453: checking for max_fail_percentage 23826 1726867444.85454: done checking for max_fail_percentage 23826 1726867444.85455: checking to see if all hosts have failed and the running result is not ok 23826 1726867444.85455: done checking to see if all hosts have failed 23826 1726867444.85457: getting the next task for host managed_node2 23826 1726867444.85464: done getting next task for host managed_node2 23826 1726867444.85465: ^ task is: None 23826 1726867444.85466: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867444.85513: in VariableManager get_vars() 23826 1726867444.85538: done with get_vars() 23826 1726867444.85544: in VariableManager get_vars() 23826 1726867444.85560: done with get_vars() 23826 1726867444.85565: variable 'omit' from source: magic vars 23826 1726867444.85713: variable 'profile' from source: play vars 23826 1726867444.85827: in VariableManager get_vars() 23826 1726867444.85843: done with get_vars() 23826 1726867444.85865: variable 'omit' from source: magic vars 23826 1726867444.85941: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 23826 1726867444.86788: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 23826 1726867444.86812: getting the remaining hosts for this loop 23826 1726867444.86814: done getting the remaining hosts for this loop 23826 1726867444.86818: getting the next task for host managed_node2 23826 1726867444.86821: done getting next task for host managed_node2 23826 1726867444.86823: ^ task is: TASK: Gathering Facts 23826 1726867444.86824: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867444.86826: getting variables 23826 1726867444.86827: in VariableManager get_vars() 23826 1726867444.86840: Calling all_inventory to load vars for managed_node2 23826 1726867444.86843: Calling groups_inventory to load vars for managed_node2 23826 1726867444.86845: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867444.86850: Calling all_plugins_play to load vars for managed_node2 23826 1726867444.86854: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867444.86857: Calling groups_plugins_play to load vars for managed_node2 23826 1726867444.88146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867444.89852: done with get_vars() 23826 1726867444.89883: done getting variables 23826 1726867444.89932: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 17:24:04 -0400 (0:00:00.660) 0:00:26.910 ****** 23826 1726867444.89955: entering _queue_task() for managed_node2/gather_facts 23826 1726867444.90332: worker is 1 (out of 1 available) 23826 1726867444.90342: exiting _queue_task() for managed_node2/gather_facts 23826 1726867444.90356: done queuing things up, now waiting for results queue to drain 23826 1726867444.90357: waiting for pending results... 
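The `Re-test connectivity` ping task earlier in this section shows the usual remote-execution round trip recorded by `_low_level_execute_command()`: discover the home directory with `/bin/sh -c 'echo ~ && sleep 0'`, create a stamped temp directory under `~/.ansible/tmp`, transfer and run `AnsiballZ_ping.py`, then remove the temp directory. The sketch below imitates those command shapes locally with `subprocess`; the `sh()` helper and the local (non-SSH) execution are illustrative assumptions, not the connection plugin's implementation:

```python
# Illustration only: the shell command shapes visible in the log above,
# run locally instead of over SSH.
import subprocess
import time


def sh(cmd: str) -> str:
    # Wrap the command the same way the log shows: /bin/sh -c '<cmd> && sleep 0'
    return subprocess.run(
        ["/bin/sh", "-c", f"{cmd} && sleep 0"],
        capture_output=True, text=True, check=True,
    ).stdout


home = sh("echo ~").strip()                      # home-directory discovery
stamp = f"ansible-tmp-{time.time()}-demo"        # stamped, unique temp dir name
tmpdir = sh(
    f'( umask 77 && mkdir -p "{home}/.ansible/tmp" '
    f'&& mkdir "{home}/.ansible/tmp/{stamp}" '
    f'&& echo "{home}/.ansible/tmp/{stamp}" )'
).strip()
print("created:", tmpdir)
sh(f'rm -f -r "{tmpdir}" > /dev/null 2>&1')      # cleanup, as in the log
```

Running this prints the created temp path and then removes it, matching the mkdir/rm pattern the log records around the module transfer.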
23826 1726867444.90668: running TaskExecutor() for managed_node2/TASK: Gathering Facts 23826 1726867444.90838: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000521 23826 1726867444.90842: variable 'ansible_search_path' from source: unknown 23826 1726867444.90853: calling self._execute() 23826 1726867444.90963: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867444.90975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867444.90992: variable 'omit' from source: magic vars 23826 1726867444.91492: variable 'ansible_distribution_major_version' from source: facts 23826 1726867444.91498: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867444.91502: variable 'omit' from source: magic vars 23826 1726867444.91509: variable 'omit' from source: magic vars 23826 1726867444.91541: variable 'omit' from source: magic vars 23826 1726867444.91586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867444.91638: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867444.91666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867444.91693: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867444.91784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867444.91788: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867444.91790: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867444.91792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867444.91893: Set connection var ansible_timeout to 10 23826 1726867444.91906: Set connection var ansible_shell_executable to /bin/sh 23826 1726867444.91916: Set connection var ansible_connection to ssh 23826 1726867444.91939: Set connection var ansible_pipelining to False 23826 1726867444.91947: Set connection var ansible_shell_type to sh 23826 1726867444.91958: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867444.91994: variable 'ansible_shell_executable' from source: unknown 23826 1726867444.92003: variable 'ansible_connection' from source: unknown 23826 1726867444.92016: variable 'ansible_module_compression' from source: unknown 23826 1726867444.92036: variable 'ansible_shell_type' from source: unknown 23826 1726867444.92182: variable 'ansible_shell_executable' from source: unknown 23826 1726867444.92185: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867444.92187: variable 'ansible_pipelining' from source: unknown 23826 1726867444.92189: variable 'ansible_timeout' from source: unknown 23826 1726867444.92191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867444.92268: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867444.92286: variable 'omit' from source: magic vars 23826 1726867444.92296: starting attempt loop 23826 1726867444.92312: running the 
handler 23826 1726867444.92333: variable 'ansible_facts' from source: unknown 23826 1726867444.92355: _low_level_execute_command(): starting 23826 1726867444.92366: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867444.93186: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867444.93328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867444.93355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867444.93388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867444.93413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867444.93460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867444.95159: stdout chunk (state=3): >>>/root <<< 23826 1726867444.95256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867444.95289: stderr chunk (state=3): >>><<< 23826 1726867444.95294: stdout chunk (state=3): >>><<< 23826 1726867444.95318: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867444.95330: _low_level_execute_command(): starting 23826 1726867444.95337: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867444.9531727-25183-100399944294630 `" && echo 
ansible-tmp-1726867444.9531727-25183-100399944294630="` echo /root/.ansible/tmp/ansible-tmp-1726867444.9531727-25183-100399944294630 `" ) && sleep 0' 23826 1726867444.95811: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867444.95815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867444.95818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 23826 1726867444.95827: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867444.95829: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867444.95882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867444.95885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867444.95887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867444.95922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867444.97884: stdout chunk (state=3): >>>ansible-tmp-1726867444.9531727-25183-100399944294630=/root/.ansible/tmp/ansible-tmp-1726867444.9531727-25183-100399944294630 <<< 23826 1726867444.97992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867444.98021: stderr chunk (state=3): >>><<< 23826 1726867444.98024: stdout chunk (state=3): >>><<< 23826 1726867444.98037: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867444.9531727-25183-100399944294630=/root/.ansible/tmp/ansible-tmp-1726867444.9531727-25183-100399944294630 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 
1726867444.98070: variable 'ansible_module_compression' from source: unknown 23826 1726867444.98127: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 23826 1726867444.98189: variable 'ansible_facts' from source: unknown 23826 1726867444.98405: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867444.9531727-25183-100399944294630/AnsiballZ_setup.py 23826 1726867444.98563: Sending initial data 23826 1726867444.98566: Sent initial data (154 bytes) 23826 1726867444.99258: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867444.99357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867444.99393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867445.01026: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 23826 1726867445.01031: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867445.01100: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867445.01106: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867444.9531727-25183-100399944294630/AnsiballZ_setup.py" <<< 23826 1726867445.01112: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpcrioxyw4 /root/.ansible/tmp/ansible-tmp-1726867444.9531727-25183-100399944294630/AnsiballZ_setup.py <<< 23826 1726867445.01149: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpcrioxyw4" to remote "/root/.ansible/tmp/ansible-tmp-1726867444.9531727-25183-100399944294630/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867444.9531727-25183-100399944294630/AnsiballZ_setup.py" <<< 23826 1726867445.02605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867445.02740: stderr chunk (state=3): >>><<< 23826 1726867445.02745: stdout chunk (state=3): >>><<< 23826 1726867445.02747: done transferring module to remote 23826 1726867445.02749: _low_level_execute_command(): starting 23826 1726867445.02751: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867444.9531727-25183-100399944294630/ /root/.ansible/tmp/ansible-tmp-1726867444.9531727-25183-100399944294630/AnsiballZ_setup.py && sleep 0' 23826 1726867445.03254: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867445.03350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867445.03422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867445.03457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867445.05266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867445.05281: stderr chunk (state=3): >>><<< 23826 1726867445.05288: stdout chunk (state=3): >>><<< 23826 1726867445.05302: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867445.05305: _low_level_execute_command(): starting 23826 1726867445.05314: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867444.9531727-25183-100399944294630/AnsiballZ_setup.py && sleep 0' 23826 1726867445.05860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867445.05864: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 23826 1726867445.05866: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867445.05869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867445.05958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867445.05963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867445.06018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867446.72503: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": 
"ec273454a5a8b2a199265679d6a78897", "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.44873046875, "5m": 0.38330078125, "15m": 0.21923828125}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "24", "second": "05", "epoch": "1726867445", "epoch_int": "1726867445", "date": "2024-09-20", "time": "17:24:05", "iso8601_micro": "2024-09-20T21:24:05.340788Z", "iso8601": "2024-09-20T21:24:05Z", "iso8601_basic": "20240920T172405340788", "iso8601_basic_short": "20240920T172405", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRE<<< 23826 1726867446.72573: stdout chunk (state=3): >>>NT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 
3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2944, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 587, "free": 2944}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 683, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794500608, "block_size": 4096, "block_total": 65519099, "block_available": 63914673, "block_used": 1604426, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["peerethtest0", "eth0", 
"ethtest0", "lo"], "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "f6:b4:26:aa:e3:d8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f4b4:26ff:feaa:e3d8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", 
"receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off 
[fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "ba:29:a4:e0:1c:3b", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b829:a4ff:fee0:1c3b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::f4b4:26ff:feaa:e3d8", "fe80::8ff:d5ff:fec3:77ad", "fe80::b829:a4ff:fee0:1c3b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", 
"127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad", "fe80::b829:a4ff:fee0:1c3b", "fe80::f4b4:26ff:feaa:e3d8"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 23826 1726867446.74685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 23826 1726867446.74767: stderr chunk (state=3): >>><<< 23826 1726867446.74968: stdout chunk (state=3): >>><<< 23826 1726867446.74974: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": 
"disabled"}, "ansible_loadavg": {"1m": 0.44873046875, "5m": 0.38330078125, "15m": 0.21923828125}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "24", "second": "05", "epoch": "1726867445", "epoch_int": "1726867445", "date": "2024-09-20", "time": "17:24:05", "iso8601_micro": "2024-09-20T21:24:05.340788Z", "iso8601": "2024-09-20T21:24:05Z", "iso8601_basic": "20240920T172405340788", "iso8601_basic_short": "20240920T172405", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2944, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 587, "free": 2944}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 
GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 683, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794500608, "block_size": 4096, "block_total": 65519099, "block_available": 63914673, "block_used": 1604426, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["peerethtest0", "eth0", "ethtest0", "lo"], "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "f6:b4:26:aa:e3:d8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f4b4:26ff:feaa:e3d8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", 
"esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": 
"on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "ba:29:a4:e0:1c:3b", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b829:a4ff:fee0:1c3b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": 
"on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::f4b4:26ff:feaa:e3d8", "fe80::8ff:d5ff:fec3:77ad", "fe80::b829:a4ff:fee0:1c3b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad", "fe80::b829:a4ff:fee0:1c3b", "fe80::f4b4:26ff:feaa:e3d8"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
23826 1726867446.75889: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867444.9531727-25183-100399944294630/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867446.75920: _low_level_execute_command(): starting 23826 1726867446.75991: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867444.9531727-25183-100399944294630/ > /dev/null 2>&1 && sleep 0' 23826 1726867446.77332: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867446.77358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867446.77441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867446.77514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867446.77536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867446.77698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867446.77728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867446.79699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867446.79702: stdout chunk (state=3): >>><<< 23826 1726867446.79847: stderr chunk (state=3): >>><<< 23826 1726867446.79850: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867446.79852: handler run complete 23826 1726867446.80087: variable 'ansible_facts' from source: unknown 23826 1726867446.80487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867446.80801: variable 'ansible_facts' from source: unknown 23826 1726867446.80902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867446.81152: attempt loop complete, returning result 23826 1726867446.81290: _execute() done 23826 1726867446.81683: dumping result to json 23826 1726867446.81686: done dumping result, returning 23826 1726867446.81688: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-a92d-a3ea-000000000521] 23826 1726867446.81691: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000521 ok: [managed_node2] 23826 1726867446.83091: no more pending results, returning what we have 23826 1726867446.83133: results queue empty 23826 1726867446.83134: checking for any_errors_fatal 23826 1726867446.83136: done checking for any_errors_fatal 23826 1726867446.83137: checking for max_fail_percentage 23826 1726867446.83139: done checking for max_fail_percentage 23826 1726867446.83139: checking to see if all hosts have failed and the running result is not ok 23826 1726867446.83141: done checking to see if all hosts have failed 23826 1726867446.83141: getting the remaining hosts for this loop 23826 1726867446.83143: done getting the remaining hosts for this loop 23826 1726867446.83146: getting the next task for host managed_node2 23826 1726867446.83152: done getting next task for host managed_node2 23826 1726867446.83154: ^ task is: TASK: meta (flush_handlers) 23826 1726867446.83156: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867446.83160: getting variables 23826 1726867446.83161: in VariableManager get_vars() 23826 1726867446.83195: Calling all_inventory to load vars for managed_node2 23826 1726867446.83198: Calling groups_inventory to load vars for managed_node2 23826 1726867446.83200: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867446.83209: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000521 23826 1726867446.83212: WORKER PROCESS EXITING 23826 1726867446.83223: Calling all_plugins_play to load vars for managed_node2 23826 1726867446.83226: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867446.83229: Calling groups_plugins_play to load vars for managed_node2 23826 1726867446.84642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867446.87046: done with get_vars() 23826 1726867446.87069: done getting variables 23826 1726867446.87143: in VariableManager get_vars() 23826 1726867446.87156: Calling all_inventory to load vars for managed_node2 23826 1726867446.87159: Calling groups_inventory to load vars for managed_node2 23826 1726867446.87161: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867446.87166: Calling all_plugins_play to load vars for managed_node2 23826 1726867446.87168: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867446.87171: Calling groups_plugins_play to load vars for managed_node2 23826 1726867446.88296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867446.89945: done with get_vars() 23826 1726867446.89976: done queuing things up, now waiting for results queue to drain 23826 1726867446.89981: results queue empty 23826 1726867446.89982: checking for any_errors_fatal 23826 1726867446.89986: done checking for any_errors_fatal 23826 1726867446.89987: checking for max_fail_percentage 23826 1726867446.89994: done checking for max_fail_percentage 23826 1726867446.89999: checking to see if all hosts have failed and the running result is not ok 23826 1726867446.90000: done checking to see if all hosts have failed 23826 1726867446.90001: getting the remaining hosts for this loop 23826 1726867446.90002: done getting the remaining hosts for this loop 23826 1726867446.90005: getting the next task for host managed_node2 23826 1726867446.90008: done getting next task for host managed_node2 23826 1726867446.90011: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 23826 1726867446.90013: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867446.90022: getting variables 23826 1726867446.90023: in VariableManager get_vars() 23826 1726867446.90036: Calling all_inventory to load vars for managed_node2 23826 1726867446.90038: Calling groups_inventory to load vars for managed_node2 23826 1726867446.90039: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867446.90044: Calling all_plugins_play to load vars for managed_node2 23826 1726867446.90046: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867446.90048: Calling groups_plugins_play to load vars for managed_node2 23826 1726867446.91165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867446.92053: done with get_vars() 23826 1726867446.92067: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:24:06 -0400 (0:00:02.021) 0:00:28.932 ****** 23826 1726867446.92126: entering _queue_task() for managed_node2/include_tasks 23826 1726867446.92381: worker is 1 (out of 1 available) 23826 1726867446.92394: exiting _queue_task() for managed_node2/include_tasks 23826 1726867446.92406: done queuing things up, now waiting for results queue to drain 23826 1726867446.92407: waiting for pending results... 23826 1726867446.92590: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 23826 1726867446.92692: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000084 23826 1726867446.92709: variable 'ansible_search_path' from source: unknown 23826 1726867446.92712: variable 'ansible_search_path' from source: unknown 23826 1726867446.92755: calling self._execute() 23826 1726867446.92876: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867446.92891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867446.92896: variable 'omit' from source: magic vars 23826 1726867446.93446: variable 'ansible_distribution_major_version' from source: facts 23826 1726867446.93683: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867446.93686: _execute() done 23826 1726867446.93693: dumping result to json 23826 1726867446.93695: done dumping result, returning 23826 1726867446.93698: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-a92d-a3ea-000000000084] 23826 1726867446.93700: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000084 23826 1726867446.93771: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000084 23826 1726867446.93774: WORKER PROCESS EXITING 23826 1726867446.93830: no more pending results, returning what we have 23826 1726867446.93834: in VariableManager get_vars() 23826 1726867446.93867: Calling all_inventory to load vars for managed_node2 23826 1726867446.93869: Calling groups_inventory to load vars for managed_node2 23826 1726867446.93871: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867446.93881: Calling all_plugins_play to load vars for managed_node2 23826 1726867446.93884: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867446.93886: Calling groups_plugins_play to load vars for managed_node2 23826 1726867446.95513: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867446.96635: done with get_vars() 23826 1726867446.96650: variable 'ansible_search_path' from source: unknown 23826 1726867446.96650: variable 'ansible_search_path' from source: unknown 23826 1726867446.96669: we have included files to process 23826 1726867446.96670: generating all_blocks data 23826 1726867446.96671: done generating all_blocks data 23826 1726867446.96671: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 23826 1726867446.96672: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 23826 1726867446.96673: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 23826 1726867446.97039: done processing included file 23826 1726867446.97040: iterating over new_blocks loaded from include file 23826 1726867446.97041: in VariableManager get_vars() 23826 1726867446.97053: done with get_vars() 23826 1726867446.97054: filtering new block on tags 23826 1726867446.97064: done filtering new block on tags 23826 1726867446.97065: in VariableManager get_vars() 23826 1726867446.97081: done with get_vars() 23826 1726867446.97082: filtering new block on tags 23826 1726867446.97093: done filtering new block on tags 23826 1726867446.97095: in VariableManager get_vars() 23826 1726867446.97106: done with get_vars() 23826 1726867446.97107: filtering new block on tags 23826 1726867446.97117: done filtering new block on tags 23826 1726867446.97118: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 23826 1726867446.97122: extending task lists for all hosts with included blocks 23826 1726867446.97421: done extending task lists 23826 1726867446.97422: done processing included files 23826 1726867446.97423: results queue empty 23826 1726867446.97424: checking for any_errors_fatal 23826 1726867446.97425: done checking for any_errors_fatal 23826 1726867446.97426: checking for max_fail_percentage 23826 1726867446.97427: done checking for max_fail_percentage 23826 1726867446.97427: checking to see if all hosts have failed and the running result is not ok 23826 1726867446.97428: done checking to see if all hosts have failed 23826 1726867446.97429: getting the remaining hosts for this loop 23826 1726867446.97430: done getting the remaining hosts for this loop 23826 1726867446.97432: getting the next task for host managed_node2 23826 1726867446.97436: done getting next task for host managed_node2 23826 1726867446.97438: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 23826 1726867446.97441: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867446.97449: getting variables 23826 1726867446.97450: in VariableManager get_vars() 23826 1726867446.97462: Calling all_inventory to load vars for managed_node2 23826 1726867446.97464: Calling groups_inventory to load vars for managed_node2 23826 1726867446.97466: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867446.97471: Calling all_plugins_play to load vars for managed_node2 23826 1726867446.97474: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867446.97479: Calling groups_plugins_play to load vars for managed_node2 23826 1726867446.98531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867446.99388: done with get_vars() 23826 1726867446.99401: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:24:06 -0400 (0:00:00.073) 0:00:29.005 ****** 23826 1726867446.99451: entering _queue_task() for managed_node2/setup 23826 1726867446.99671: worker is 1 (out of 1 available) 23826 1726867446.99685: exiting _queue_task() for managed_node2/setup 23826 1726867446.99697: done queuing things up, now waiting for results queue to drain 23826 1726867446.99698: waiting for pending results... 23826 1726867446.99865: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 23826 1726867446.99949: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000562 23826 1726867446.99960: variable 'ansible_search_path' from source: unknown 23826 1726867446.99964: variable 'ansible_search_path' from source: unknown 23826 1726867446.99991: calling self._execute() 23826 1726867447.00059: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867447.00065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867447.00073: variable 'omit' from source: magic vars 23826 1726867447.00330: variable 'ansible_distribution_major_version' from source: facts 23826 1726867447.00339: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867447.00488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867447.01952: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867447.01999: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867447.02027: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867447.02051: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867447.02071: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867447.02130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867447.02150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 23826 1726867447.02166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867447.02195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867447.02212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867447.02245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867447.02261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867447.02279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867447.02304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867447.02318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867447.02426: variable '__network_required_facts' from source: role '' defaults 23826 1726867447.02432: variable 'ansible_facts' from source: unknown 23826 1726867447.02847: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 23826 1726867447.02851: when evaluation is False, skipping this task 23826 1726867447.02856: _execute() done 23826 1726867447.02859: dumping result to json 23826 1726867447.02861: done dumping result, returning 23826 1726867447.02866: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-a92d-a3ea-000000000562] 23826 1726867447.02876: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000562 23826 1726867447.02947: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000562 23826 1726867447.02949: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 23826 1726867447.03028: no more pending results, returning what we have 23826 1726867447.03031: results queue empty 23826 1726867447.03032: checking for any_errors_fatal 23826 1726867447.03033: done checking for any_errors_fatal 23826 1726867447.03034: checking for max_fail_percentage 23826 1726867447.03036: done checking for max_fail_percentage 23826 1726867447.03037: checking to see if all hosts have failed and the running result is not ok 23826 1726867447.03038: done checking to see if all hosts have failed 23826 1726867447.03039: getting the remaining hosts for this loop 23826 1726867447.03040: done getting the remaining hosts for 
this loop 23826 1726867447.03043: getting the next task for host managed_node2 23826 1726867447.03049: done getting next task for host managed_node2 23826 1726867447.03052: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 23826 1726867447.03055: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867447.03067: getting variables 23826 1726867447.03068: in VariableManager get_vars() 23826 1726867447.03101: Calling all_inventory to load vars for managed_node2 23826 1726867447.03103: Calling groups_inventory to load vars for managed_node2 23826 1726867447.03105: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867447.03114: Calling all_plugins_play to load vars for managed_node2 23826 1726867447.03117: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867447.03119: Calling groups_plugins_play to load vars for managed_node2 23826 1726867447.03902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867447.04769: done with get_vars() 23826 1726867447.04785: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:24:07 -0400 (0:00:00.053) 0:00:29.059 ****** 23826 1726867447.04848: entering _queue_task() for managed_node2/stat 23826 1726867447.05041: worker is 1 (out of 1 available) 23826 1726867447.05053: exiting _queue_task() for managed_node2/stat 23826 1726867447.05065: done queuing things up, now waiting for results queue to drain 23826 1726867447.05066: waiting for pending results... 
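The skip recorded above for "Ensure ansible_facts used by role are present" comes from a guard that only re-runs fact gathering when facts the role needs are missing from ansible_facts. A minimal sketch of that pattern, reusing the conditional quoted in the log (the setup arguments shown are an assumption for illustration, not the role's exact task):

- name: Ensure ansible_facts used by role are present
  setup:
    # assumption: the real task may request a different gather_subset or a filter
    gather_subset: min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

In this run the difference is empty, so the conditional evaluates False and the task is skipped, as the log shows.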
23826 1726867447.05247: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 23826 1726867447.05335: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000564 23826 1726867447.05346: variable 'ansible_search_path' from source: unknown 23826 1726867447.05349: variable 'ansible_search_path' from source: unknown 23826 1726867447.05384: calling self._execute() 23826 1726867447.05453: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867447.05457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867447.05466: variable 'omit' from source: magic vars 23826 1726867447.05728: variable 'ansible_distribution_major_version' from source: facts 23826 1726867447.05740: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867447.05852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867447.06057: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867447.06092: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867447.06119: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867447.06143: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867447.06205: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867447.06225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867447.06243: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867447.06261: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867447.06325: variable '__network_is_ostree' from source: set_fact 23826 1726867447.06331: Evaluated conditional (not __network_is_ostree is defined): False 23826 1726867447.06334: when evaluation is False, skipping this task 23826 1726867447.06337: _execute() done 23826 1726867447.06339: dumping result to json 23826 1726867447.06343: done dumping result, returning 23826 1726867447.06350: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-a92d-a3ea-000000000564] 23826 1726867447.06354: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000564 23826 1726867447.06430: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000564 23826 1726867447.06432: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 23826 1726867447.06480: no more pending results, returning what we have 23826 1726867447.06484: results queue empty 23826 1726867447.06485: checking for any_errors_fatal 23826 1726867447.06490: done checking for any_errors_fatal 23826 1726867447.06491: checking for 
max_fail_percentage 23826 1726867447.06492: done checking for max_fail_percentage 23826 1726867447.06493: checking to see if all hosts have failed and the running result is not ok 23826 1726867447.06494: done checking to see if all hosts have failed 23826 1726867447.06495: getting the remaining hosts for this loop 23826 1726867447.06496: done getting the remaining hosts for this loop 23826 1726867447.06499: getting the next task for host managed_node2 23826 1726867447.06504: done getting next task for host managed_node2 23826 1726867447.06507: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 23826 1726867447.06510: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867447.06521: getting variables 23826 1726867447.06522: in VariableManager get_vars() 23826 1726867447.06550: Calling all_inventory to load vars for managed_node2 23826 1726867447.06552: Calling groups_inventory to load vars for managed_node2 23826 1726867447.06554: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867447.06561: Calling all_plugins_play to load vars for managed_node2 23826 1726867447.06564: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867447.06566: Calling groups_plugins_play to load vars for managed_node2 23826 1726867447.07297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867447.08248: done with get_vars() 23826 1726867447.08262: done getting variables 23826 1726867447.08300: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:24:07 -0400 (0:00:00.034) 0:00:29.094 ****** 23826 1726867447.08323: entering _queue_task() for managed_node2/set_fact 23826 1726867447.08515: worker is 1 (out of 1 available) 23826 1726867447.08528: exiting _queue_task() for managed_node2/set_fact 23826 1726867447.08539: done queuing things up, now waiting for results queue to drain 23826 1726867447.08540: waiting for pending results... 
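The two skips around this point ("Check if system is ostree" and "Set flag to indicate system is ostree") follow a common probe-once pattern: stat an ostree marker, cache the answer in __network_is_ostree, and guard both tasks with "not __network_is_ostree is defined" so later invocations skip the probe, which is exactly why both are skipped here. A sketch of that pattern, assuming the conventional /run/ostree-booted marker and a hypothetical register name (__ostree_stat):

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted   # assumption: conventional marker file on ostree-based systems
  register: __ostree_stat      # hypothetical name, for illustration only
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_stat.stat.exists }}"
  when: not __network_is_ostree is defined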
23826 1726867447.08711: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 23826 1726867447.08791: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000565 23826 1726867447.08803: variable 'ansible_search_path' from source: unknown 23826 1726867447.08806: variable 'ansible_search_path' from source: unknown 23826 1726867447.08834: calling self._execute() 23826 1726867447.08898: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867447.08904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867447.08916: variable 'omit' from source: magic vars 23826 1726867447.09171: variable 'ansible_distribution_major_version' from source: facts 23826 1726867447.09182: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867447.09294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867447.09479: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867447.09510: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867447.09537: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867447.09567: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867447.09624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867447.09642: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867447.09661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867447.09683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867447.09743: variable '__network_is_ostree' from source: set_fact 23826 1726867447.09748: Evaluated conditional (not __network_is_ostree is defined): False 23826 1726867447.09751: when evaluation is False, skipping this task 23826 1726867447.09754: _execute() done 23826 1726867447.09758: dumping result to json 23826 1726867447.09761: done dumping result, returning 23826 1726867447.09768: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-a92d-a3ea-000000000565] 23826 1726867447.09772: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000565 23826 1726867447.09850: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000565 23826 1726867447.09853: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 23826 1726867447.09897: no more pending results, returning what we have 23826 1726867447.09900: results queue empty 23826 1726867447.09902: checking for any_errors_fatal 23826 1726867447.09906: done checking for any_errors_fatal 23826 
1726867447.09907: checking for max_fail_percentage 23826 1726867447.09908: done checking for max_fail_percentage 23826 1726867447.09909: checking to see if all hosts have failed and the running result is not ok 23826 1726867447.09911: done checking to see if all hosts have failed 23826 1726867447.09911: getting the remaining hosts for this loop 23826 1726867447.09913: done getting the remaining hosts for this loop 23826 1726867447.09916: getting the next task for host managed_node2 23826 1726867447.09923: done getting next task for host managed_node2 23826 1726867447.09926: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 23826 1726867447.09928: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867447.09940: getting variables 23826 1726867447.09941: in VariableManager get_vars() 23826 1726867447.09968: Calling all_inventory to load vars for managed_node2 23826 1726867447.09970: Calling groups_inventory to load vars for managed_node2 23826 1726867447.09972: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867447.09986: Calling all_plugins_play to load vars for managed_node2 23826 1726867447.09989: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867447.09992: Calling groups_plugins_play to load vars for managed_node2 23826 1726867447.10721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867447.11583: done with get_vars() 23826 1726867447.11597: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:24:07 -0400 (0:00:00.033) 0:00:29.127 ****** 23826 1726867447.11658: entering _queue_task() for managed_node2/service_facts 23826 1726867447.11847: worker is 1 (out of 1 available) 23826 1726867447.11861: exiting _queue_task() for managed_node2/service_facts 23826 1726867447.11872: done queuing things up, now waiting for results queue to drain 23826 1726867447.11874: waiting for pending results... 
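The service_facts run that follows populates ansible_facts.services with one entry per systemd unit (the large JSON blob returned by AnsiballZ_service_facts.py below). A role or playbook can then branch on a unit's presence and state without shelling out to systemctl; an illustrative consumer, not a task from this run:

- name: Check which services are running
  service_facts:

- name: React to NetworkManager state (illustrative example)
  debug:
    msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"
  when: "'NetworkManager.service' in ansible_facts.services"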
23826 1726867447.12035: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 23826 1726867447.12107: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000567 23826 1726867447.12119: variable 'ansible_search_path' from source: unknown 23826 1726867447.12122: variable 'ansible_search_path' from source: unknown 23826 1726867447.12146: calling self._execute() 23826 1726867447.12212: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867447.12217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867447.12226: variable 'omit' from source: magic vars 23826 1726867447.12484: variable 'ansible_distribution_major_version' from source: facts 23826 1726867447.12493: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867447.12498: variable 'omit' from source: magic vars 23826 1726867447.12543: variable 'omit' from source: magic vars 23826 1726867447.12567: variable 'omit' from source: magic vars 23826 1726867447.12598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867447.12626: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867447.12642: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867447.12658: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867447.12667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867447.12690: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867447.12693: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867447.12696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867447.12765: Set connection var ansible_timeout to 10 23826 1726867447.12779: Set connection var ansible_shell_executable to /bin/sh 23826 1726867447.12782: Set connection var ansible_connection to ssh 23826 1726867447.12789: Set connection var ansible_pipelining to False 23826 1726867447.12792: Set connection var ansible_shell_type to sh 23826 1726867447.12795: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867447.12815: variable 'ansible_shell_executable' from source: unknown 23826 1726867447.12818: variable 'ansible_connection' from source: unknown 23826 1726867447.12821: variable 'ansible_module_compression' from source: unknown 23826 1726867447.12823: variable 'ansible_shell_type' from source: unknown 23826 1726867447.12826: variable 'ansible_shell_executable' from source: unknown 23826 1726867447.12828: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867447.12830: variable 'ansible_pipelining' from source: unknown 23826 1726867447.12833: variable 'ansible_timeout' from source: unknown 23826 1726867447.12836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867447.12981: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 23826 1726867447.12987: variable 'omit' from source: magic vars 23826 
1726867447.12990: starting attempt loop 23826 1726867447.12992: running the handler 23826 1726867447.13005: _low_level_execute_command(): starting 23826 1726867447.13012: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867447.13514: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867447.13518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867447.13520: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867447.13523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867447.13591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867447.13595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867447.13651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867447.15410: stdout chunk (state=3): >>>/root <<< 23826 1726867447.15557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867447.15561: stdout chunk (state=3): >>><<< 23826 1726867447.15563: stderr chunk (state=3): >>><<< 23826 1726867447.15582: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867447.15676: _low_level_execute_command(): starting 23826 1726867447.15683: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867447.1559484-25291-155157033142778 `" && echo ansible-tmp-1726867447.1559484-25291-155157033142778="` echo /root/.ansible/tmp/ansible-tmp-1726867447.1559484-25291-155157033142778 `" ) && sleep 0' 23826 1726867447.16107: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867447.16120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867447.16132: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867447.16186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867447.16190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867447.16211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867447.16251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867447.18313: stdout chunk (state=3): >>>ansible-tmp-1726867447.1559484-25291-155157033142778=/root/.ansible/tmp/ansible-tmp-1726867447.1559484-25291-155157033142778 <<< 23826 1726867447.18466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867447.18469: stdout chunk (state=3): >>><<< 23826 1726867447.18471: stderr chunk (state=3): >>><<< 23826 1726867447.18488: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867447.1559484-25291-155157033142778=/root/.ansible/tmp/ansible-tmp-1726867447.1559484-25291-155157033142778 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867447.18542: variable 
'ansible_module_compression' from source: unknown 23826 1726867447.18627: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 23826 1726867447.18630: variable 'ansible_facts' from source: unknown 23826 1726867447.18716: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867447.1559484-25291-155157033142778/AnsiballZ_service_facts.py 23826 1726867447.18862: Sending initial data 23826 1726867447.18976: Sent initial data (162 bytes) 23826 1726867447.19593: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867447.19658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867447.19709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867447.19794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867447.21463: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 23826 1726867447.21491: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867447.21532: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867447.21596: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmp5z_nldtk /root/.ansible/tmp/ansible-tmp-1726867447.1559484-25291-155157033142778/AnsiballZ_service_facts.py <<< 23826 1726867447.21600: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867447.1559484-25291-155157033142778/AnsiballZ_service_facts.py" <<< 23826 1726867447.21637: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmp5z_nldtk" to remote "/root/.ansible/tmp/ansible-tmp-1726867447.1559484-25291-155157033142778/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867447.1559484-25291-155157033142778/AnsiballZ_service_facts.py" <<< 23826 1726867447.22420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867447.22457: stderr chunk (state=3): >>><<< 23826 1726867447.22559: stdout chunk (state=3): >>><<< 23826 1726867447.22569: done transferring module to remote 23826 1726867447.22587: _low_level_execute_command(): starting 23826 1726867447.22595: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867447.1559484-25291-155157033142778/ /root/.ansible/tmp/ansible-tmp-1726867447.1559484-25291-155157033142778/AnsiballZ_service_facts.py && sleep 0' 23826 1726867447.23329: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867447.23340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867447.23363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867447.23448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867447.25389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867447.25404: stdout chunk (state=3): >>><<< 23826 1726867447.25416: stderr chunk (state=3): >>><<< 23826 1726867447.25437: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867447.25446: _low_level_execute_command(): starting 23826 1726867447.25455: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867447.1559484-25291-155157033142778/AnsiballZ_service_facts.py && sleep 0' 23826 1726867447.26041: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867447.26056: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867447.26069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867447.26088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867447.26144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867447.26201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867447.26218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867447.26247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867447.26326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867448.85969: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": 
"cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 23826 1726867448.85975: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 23826 1726867448.86015: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": 
{"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 23826 1726867448.86031: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state":<<< 23826 1726867448.86035: stdout chunk (state=3): >>> "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": 
"nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": <<< 23826 1726867448.86058: stdout chunk (state=3): >>>"static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 23826 1726867448.87625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 23826 1726867448.87658: stderr chunk (state=3): >>><<< 23826 1726867448.87662: stdout chunk (state=3): >>><<< 23826 1726867448.87704: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": 
{"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": 
{"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 
originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 23826 1726867448.88749: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867447.1559484-25291-155157033142778/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867448.88753: _low_level_execute_command(): starting 23826 1726867448.88755: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867447.1559484-25291-155157033142778/ > /dev/null 2>&1 && sleep 0' 23826 1726867448.89210: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867448.89214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867448.89216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 23826 1726867448.89219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867448.89220: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867448.89272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867448.89281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867448.89284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867448.89318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867448.91175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867448.91200: stderr chunk (state=3): >>><<< 23826 1726867448.91203: stdout chunk (state=3): >>><<< 23826 1726867448.91217: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867448.91221: handler run complete 23826 1726867448.91335: variable 'ansible_facts' from source: unknown 23826 1726867448.91432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867448.91704: variable 'ansible_facts' from source: unknown 23826 1726867448.91781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867448.91895: attempt loop complete, returning result 23826 1726867448.91898: _execute() done 23826 1726867448.91901: dumping result to json 23826 1726867448.91939: done dumping result, returning 23826 1726867448.91945: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-a92d-a3ea-000000000567] 23826 1726867448.91949: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000567 23826 1726867448.92688: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000567 23826 1726867448.92691: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 23826 1726867448.92741: no more pending results, returning what we have 23826 1726867448.92743: results queue empty 23826 1726867448.92744: checking for any_errors_fatal 23826 1726867448.92746: done checking for any_errors_fatal 23826 1726867448.92746: checking for max_fail_percentage 23826 1726867448.92747: done checking for max_fail_percentage 23826 1726867448.92748: checking to see if all hosts have failed and the running result is not ok 23826 1726867448.92749: done checking to see if all hosts have failed 23826 1726867448.92749: getting the remaining hosts for this loop 23826 1726867448.92750: done getting the remaining hosts for this loop 23826 1726867448.92752: getting the next task for host managed_node2 23826 1726867448.92756: done getting next task for host managed_node2 23826 1726867448.92758: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 23826 1726867448.92760: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867448.92766: getting variables 23826 1726867448.92766: in VariableManager get_vars() 23826 1726867448.92789: Calling all_inventory to load vars for managed_node2 23826 1726867448.92791: Calling groups_inventory to load vars for managed_node2 23826 1726867448.92792: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867448.92798: Calling all_plugins_play to load vars for managed_node2 23826 1726867448.92800: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867448.92802: Calling groups_plugins_play to load vars for managed_node2 23826 1726867448.93475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867448.94347: done with get_vars() 23826 1726867448.94363: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:24:08 -0400 (0:00:01.827) 0:00:30.955 ****** 23826 1726867448.94433: entering _queue_task() for managed_node2/package_facts 23826 1726867448.94679: worker is 1 (out of 1 available) 23826 1726867448.94692: exiting _queue_task() for managed_node2/package_facts 23826 1726867448.94704: done queuing things up, now waiting for results queue to drain 23826 1726867448.94706: waiting for pending results... 23826 1726867448.94887: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 23826 1726867448.94971: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000568 23826 1726867448.94988: variable 'ansible_search_path' from source: unknown 23826 1726867448.94991: variable 'ansible_search_path' from source: unknown 23826 1726867448.95020: calling self._execute() 23826 1726867448.95094: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867448.95097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867448.95113: variable 'omit' from source: magic vars 23826 1726867448.95389: variable 'ansible_distribution_major_version' from source: facts 23826 1726867448.95398: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867448.95404: variable 'omit' from source: magic vars 23826 1726867448.95453: variable 'omit' from source: magic vars 23826 1726867448.95479: variable 'omit' from source: magic vars 23826 1726867448.95520: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867448.95546: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867448.95561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867448.95574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867448.95593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867448.95614: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867448.95618: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867448.95620: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node2' 23826 1726867448.95689: Set connection var ansible_timeout to 10 23826 1726867448.95701: Set connection var ansible_shell_executable to /bin/sh 23826 1726867448.95704: Set connection var ansible_connection to ssh 23826 1726867448.95706: Set connection var ansible_pipelining to False 23826 1726867448.95708: Set connection var ansible_shell_type to sh 23826 1726867448.95714: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867448.95732: variable 'ansible_shell_executable' from source: unknown 23826 1726867448.95735: variable 'ansible_connection' from source: unknown 23826 1726867448.95738: variable 'ansible_module_compression' from source: unknown 23826 1726867448.95740: variable 'ansible_shell_type' from source: unknown 23826 1726867448.95742: variable 'ansible_shell_executable' from source: unknown 23826 1726867448.95744: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867448.95749: variable 'ansible_pipelining' from source: unknown 23826 1726867448.95751: variable 'ansible_timeout' from source: unknown 23826 1726867448.95755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867448.95898: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 23826 1726867448.95908: variable 'omit' from source: magic vars 23826 1726867448.95918: starting attempt loop 23826 1726867448.95921: running the handler 23826 1726867448.95931: _low_level_execute_command(): starting 23826 1726867448.95937: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867448.96445: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867448.96449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 23826 1726867448.96453: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 23826 1726867448.96458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867448.96509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867448.96513: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867448.96517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867448.96561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867448.98217: stdout chunk (state=3): >>>/root <<< 23826 1726867448.98318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 
1726867448.98344: stderr chunk (state=3): >>><<< 23826 1726867448.98348: stdout chunk (state=3): >>><<< 23826 1726867448.98365: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867448.98376: _low_level_execute_command(): starting 23826 1726867448.98384: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867448.983645-25363-156371970323436 `" && echo ansible-tmp-1726867448.983645-25363-156371970323436="` echo /root/.ansible/tmp/ansible-tmp-1726867448.983645-25363-156371970323436 `" ) && sleep 0' 23826 1726867448.98803: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867448.98807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867448.98809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 23826 1726867448.98821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867448.98823: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867448.98865: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867448.98868: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867448.98917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867449.00838: stdout chunk (state=3): >>>ansible-tmp-1726867448.983645-25363-156371970323436=/root/.ansible/tmp/ansible-tmp-1726867448.983645-25363-156371970323436 <<< 23826 
1726867449.00945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867449.00967: stderr chunk (state=3): >>><<< 23826 1726867449.00972: stdout chunk (state=3): >>><<< 23826 1726867449.00987: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867448.983645-25363-156371970323436=/root/.ansible/tmp/ansible-tmp-1726867448.983645-25363-156371970323436 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867449.01022: variable 'ansible_module_compression' from source: unknown 23826 1726867449.01056: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 23826 1726867449.01110: variable 'ansible_facts' from source: unknown 23826 1726867449.01228: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867448.983645-25363-156371970323436/AnsiballZ_package_facts.py 23826 1726867449.01328: Sending initial data 23826 1726867449.01332: Sent initial data (161 bytes) 23826 1726867449.01734: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867449.01768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867449.01771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867449.01773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867449.01775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867449.01828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867449.01834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 
1726867449.01836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867449.01876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867449.03495: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 23826 1726867449.03498: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867449.03533: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 23826 1726867449.03575: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpt3ik_dor /root/.ansible/tmp/ansible-tmp-1726867448.983645-25363-156371970323436/AnsiballZ_package_facts.py <<< 23826 1726867449.03579: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867448.983645-25363-156371970323436/AnsiballZ_package_facts.py" <<< 23826 1726867449.03610: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpt3ik_dor" to remote "/root/.ansible/tmp/ansible-tmp-1726867448.983645-25363-156371970323436/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867448.983645-25363-156371970323436/AnsiballZ_package_facts.py" <<< 23826 1726867449.04679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867449.04713: stderr chunk (state=3): >>><<< 23826 1726867449.04716: stdout chunk (state=3): >>><<< 23826 1726867449.04751: done transferring module to remote 23826 1726867449.04758: _low_level_execute_command(): starting 23826 1726867449.04762: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867448.983645-25363-156371970323436/ /root/.ansible/tmp/ansible-tmp-1726867448.983645-25363-156371970323436/AnsiballZ_package_facts.py && sleep 0' 23826 1726867449.05161: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867449.05165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867449.05178: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867449.05237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867449.05244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867449.05282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867449.07128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867449.07151: stderr chunk (state=3): >>><<< 23826 1726867449.07156: stdout chunk (state=3): >>><<< 23826 1726867449.07165: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867449.07169: _low_level_execute_command(): starting 23826 1726867449.07171: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867448.983645-25363-156371970323436/AnsiballZ_package_facts.py && sleep 0' 23826 1726867449.07550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867449.07554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867449.07564: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867449.07622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867449.07627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 23826 1726867449.07670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867449.52539: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", 
"version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 23826 1726867449.52571: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": 
[{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 23826 1726867449.52632: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": 
"2.8", "release": "7.el10",<<< 23826 1726867449.52670: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": 
"nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 23826 1726867449.52755: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": 
"9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": 
"510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": 
"35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 23826 1726867449.52760: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 23826 1726867449.54668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 23826 1726867449.54671: stdout chunk (state=3): >>><<< 23826 1726867449.54673: stderr chunk (state=3): >>><<< 23826 1726867449.54702: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 23826 1726867449.57919: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867448.983645-25363-156371970323436/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867449.57923: _low_level_execute_command(): starting 23826 1726867449.57926: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867448.983645-25363-156371970323436/ > /dev/null 2>&1 && sleep 0' 23826 1726867449.58588: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867449.58653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867449.58683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867449.58725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867449.58762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867449.60624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867449.60645: stderr chunk (state=3): >>><<< 23826 1726867449.60648: stdout chunk (state=3): >>><<< 23826 1726867449.60659: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867449.60665: handler run complete 23826 1726867449.61114: variable 'ansible_facts' from source: unknown 23826 1726867449.61366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867449.66122: variable 'ansible_facts' from source: unknown 23826 1726867449.66350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867449.66728: attempt loop complete, returning result 23826 1726867449.66737: _execute() done 23826 1726867449.66740: dumping result to json 23826 1726867449.66853: done dumping result, returning 23826 1726867449.66859: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-a92d-a3ea-000000000568] 23826 1726867449.66864: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000568 23826 1726867449.68108: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000568 23826 1726867449.68112: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 23826 1726867449.68193: no more pending results, returning what we have 23826 1726867449.68195: results queue empty 23826 1726867449.68195: checking for any_errors_fatal 23826 1726867449.68198: done checking for any_errors_fatal 23826 1726867449.68199: checking for max_fail_percentage 23826 1726867449.68200: done checking for max_fail_percentage 23826 1726867449.68200: checking to see if all hosts have failed and the running result is not ok 23826 1726867449.68201: done checking to see if all hosts have failed 23826 1726867449.68201: getting the remaining hosts for this loop 23826 1726867449.68202: done getting the remaining hosts for this loop 23826 1726867449.68205: getting the next task for host managed_node2 23826 1726867449.68211: done getting next task for host managed_node2 23826 1726867449.68213: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 23826 1726867449.68215: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867449.68220: getting variables 23826 1726867449.68221: in VariableManager get_vars() 23826 1726867449.68245: Calling all_inventory to load vars for managed_node2 23826 1726867449.68247: Calling groups_inventory to load vars for managed_node2 23826 1726867449.68248: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867449.68254: Calling all_plugins_play to load vars for managed_node2 23826 1726867449.68256: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867449.68257: Calling groups_plugins_play to load vars for managed_node2 23826 1726867449.72025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867449.72883: done with get_vars() 23826 1726867449.72899: done getting variables 23826 1726867449.72934: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:24:09 -0400 (0:00:00.785) 0:00:31.740 ****** 23826 1726867449.72952: entering _queue_task() for managed_node2/debug 23826 1726867449.73206: worker is 1 (out of 1 available) 23826 1726867449.73222: exiting _queue_task() for managed_node2/debug 23826 1726867449.73232: done queuing things up, now waiting for results queue to drain 23826 1726867449.73233: waiting for pending results... 23826 1726867449.73412: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 23826 1726867449.73475: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000085 23826 1726867449.73487: variable 'ansible_search_path' from source: unknown 23826 1726867449.73491: variable 'ansible_search_path' from source: unknown 23826 1726867449.73520: calling self._execute() 23826 1726867449.73592: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867449.73597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867449.73606: variable 'omit' from source: magic vars 23826 1726867449.73879: variable 'ansible_distribution_major_version' from source: facts 23826 1726867449.73889: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867449.73896: variable 'omit' from source: magic vars 23826 1726867449.73922: variable 'omit' from source: magic vars 23826 1726867449.73993: variable 'network_provider' from source: set_fact 23826 1726867449.74011: variable 'omit' from source: magic vars 23826 1726867449.74041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867449.74069: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867449.74087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867449.74101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867449.74115: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 
1726867449.74165: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867449.74169: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867449.74172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867449.74247: Set connection var ansible_timeout to 10 23826 1726867449.74254: Set connection var ansible_shell_executable to /bin/sh 23826 1726867449.74257: Set connection var ansible_connection to ssh 23826 1726867449.74264: Set connection var ansible_pipelining to False 23826 1726867449.74268: Set connection var ansible_shell_type to sh 23826 1726867449.74270: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867449.74294: variable 'ansible_shell_executable' from source: unknown 23826 1726867449.74297: variable 'ansible_connection' from source: unknown 23826 1726867449.74300: variable 'ansible_module_compression' from source: unknown 23826 1726867449.74302: variable 'ansible_shell_type' from source: unknown 23826 1726867449.74305: variable 'ansible_shell_executable' from source: unknown 23826 1726867449.74307: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867449.74309: variable 'ansible_pipelining' from source: unknown 23826 1726867449.74315: variable 'ansible_timeout' from source: unknown 23826 1726867449.74317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867449.74418: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867449.74428: variable 'omit' from source: magic vars 23826 1726867449.74432: starting attempt loop 23826 1726867449.74437: running the handler 23826 1726867449.74472: handler run complete 23826 1726867449.74483: attempt loop complete, returning result 23826 1726867449.74487: _execute() done 23826 1726867449.74491: dumping result to json 23826 1726867449.74493: done dumping result, returning 23826 1726867449.74501: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-a92d-a3ea-000000000085] 23826 1726867449.74504: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000085 23826 1726867449.74581: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000085 23826 1726867449.74584: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 23826 1726867449.74641: no more pending results, returning what we have 23826 1726867449.74644: results queue empty 23826 1726867449.74645: checking for any_errors_fatal 23826 1726867449.74656: done checking for any_errors_fatal 23826 1726867449.74656: checking for max_fail_percentage 23826 1726867449.74658: done checking for max_fail_percentage 23826 1726867449.74659: checking to see if all hosts have failed and the running result is not ok 23826 1726867449.74660: done checking to see if all hosts have failed 23826 1726867449.74661: getting the remaining hosts for this loop 23826 1726867449.74662: done getting the remaining hosts for this loop 23826 1726867449.74666: getting the next task for host managed_node2 23826 1726867449.74671: done getting next task for host managed_node2 23826 1726867449.74675: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 23826 1726867449.74679: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867449.74689: getting variables 23826 1726867449.74690: in VariableManager get_vars() 23826 1726867449.74727: Calling all_inventory to load vars for managed_node2 23826 1726867449.74730: Calling groups_inventory to load vars for managed_node2 23826 1726867449.74732: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867449.74740: Calling all_plugins_play to load vars for managed_node2 23826 1726867449.74742: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867449.74745: Calling groups_plugins_play to load vars for managed_node2 23826 1726867449.75486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867449.76854: done with get_vars() 23826 1726867449.76870: done getting variables 23826 1726867449.76910: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:24:09 -0400 (0:00:00.039) 0:00:31.780 ****** 23826 1726867449.76930: entering _queue_task() for managed_node2/fail 23826 1726867449.77137: worker is 1 (out of 1 available) 23826 1726867449.77150: exiting _queue_task() for managed_node2/fail 23826 1726867449.77161: done queuing things up, now waiting for results queue to drain 23826 1726867449.77162: waiting for pending results... 
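The large JSON result earlier in this trace is the package_facts return value for the role's "Check which packages are installed" step (reported as censored in the task result because no_log was set), and the debug step that follows it resolves the provider to nm. A minimal, hypothetical sketch of what those two tasks could look like; the real contents of roles/network/tasks/main.yml are not shown in this log, and the fact layout (ansible_facts.packages keyed by package name, each value a list of name/version/release/epoch/arch/source entries) is taken from the module output above:

- name: Check which packages are installed
  package_facts:
    manager: auto
    strategy: first
  no_log: true

- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"

# Example consumer of the gathered facts (hypothetical, not part of the role):
- name: Show the installed NetworkManager version
  debug:
    msg: "{{ ansible_facts.packages['NetworkManager'][0].version }}"
  when: "'NetworkManager' in ansible_facts.packages"
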
23826 1726867449.77332: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 23826 1726867449.77399: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000086 23826 1726867449.77411: variable 'ansible_search_path' from source: unknown 23826 1726867449.77416: variable 'ansible_search_path' from source: unknown 23826 1726867449.77443: calling self._execute() 23826 1726867449.77517: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867449.77523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867449.77532: variable 'omit' from source: magic vars 23826 1726867449.77790: variable 'ansible_distribution_major_version' from source: facts 23826 1726867449.77799: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867449.77882: variable 'network_state' from source: role '' defaults 23826 1726867449.77892: Evaluated conditional (network_state != {}): False 23826 1726867449.77895: when evaluation is False, skipping this task 23826 1726867449.77899: _execute() done 23826 1726867449.77901: dumping result to json 23826 1726867449.77904: done dumping result, returning 23826 1726867449.77910: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-a92d-a3ea-000000000086] 23826 1726867449.77917: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000086 23826 1726867449.78006: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000086 23826 1726867449.78009: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 23826 1726867449.78069: no more pending results, returning what we have 23826 1726867449.78072: results queue empty 23826 1726867449.78073: checking for any_errors_fatal 23826 1726867449.78084: done checking for any_errors_fatal 23826 1726867449.78085: checking for max_fail_percentage 23826 1726867449.78087: done checking for max_fail_percentage 23826 1726867449.78087: checking to see if all hosts have failed and the running result is not ok 23826 1726867449.78088: done checking to see if all hosts have failed 23826 1726867449.78089: getting the remaining hosts for this loop 23826 1726867449.78090: done getting the remaining hosts for this loop 23826 1726867449.78093: getting the next task for host managed_node2 23826 1726867449.78097: done getting next task for host managed_node2 23826 1726867449.78100: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 23826 1726867449.78102: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867449.78114: getting variables 23826 1726867449.78116: in VariableManager get_vars() 23826 1726867449.78143: Calling all_inventory to load vars for managed_node2 23826 1726867449.78146: Calling groups_inventory to load vars for managed_node2 23826 1726867449.78148: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867449.78154: Calling all_plugins_play to load vars for managed_node2 23826 1726867449.78157: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867449.78158: Calling groups_plugins_play to load vars for managed_node2 23826 1726867449.79214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867449.80711: done with get_vars() 23826 1726867449.80734: done getting variables 23826 1726867449.80792: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:24:09 -0400 (0:00:00.038) 0:00:31.819 ****** 23826 1726867449.80820: entering _queue_task() for managed_node2/fail 23826 1726867449.81082: worker is 1 (out of 1 available) 23826 1726867449.81096: exiting _queue_task() for managed_node2/fail 23826 1726867449.81108: done queuing things up, now waiting for results queue to drain 23826 1726867449.81110: waiting for pending results... 
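The skip above (false_condition: network_state != {}) is the role's guard against applying network_state through the initscripts provider when no state was requested. A minimal, hypothetical sketch of such a guarded fail task; the first condition is the one reported in the trace, the second is an assumption, and the real task at roles/network/tasks/main.yml:11 may differ:

- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  fail:
    msg: >-
      Applying network_state is not supported with the initscripts
      provider; use the NetworkManager (nm) provider instead.
  when:
    - network_state != {}                 # evaluated False above, so the task was skipped
    - network_provider == "initscripts"   # assumed second guard, not shown in this trace
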
23826 1726867449.81499: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 23826 1726867449.81504: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000087 23826 1726867449.81523: variable 'ansible_search_path' from source: unknown 23826 1726867449.81532: variable 'ansible_search_path' from source: unknown 23826 1726867449.81571: calling self._execute() 23826 1726867449.81670: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867449.81685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867449.81707: variable 'omit' from source: magic vars 23826 1726867449.82070: variable 'ansible_distribution_major_version' from source: facts 23826 1726867449.82090: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867449.82209: variable 'network_state' from source: role '' defaults 23826 1726867449.82227: Evaluated conditional (network_state != {}): False 23826 1726867449.82234: when evaluation is False, skipping this task 23826 1726867449.82244: _execute() done 23826 1726867449.82254: dumping result to json 23826 1726867449.82261: done dumping result, returning 23826 1726867449.82273: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-a92d-a3ea-000000000087] 23826 1726867449.82285: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000087 23826 1726867449.82507: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000087 23826 1726867449.82510: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 23826 1726867449.82554: no more pending results, returning what we have 23826 1726867449.82557: results queue empty 23826 1726867449.82558: checking for any_errors_fatal 23826 1726867449.82565: done checking for any_errors_fatal 23826 1726867449.82566: checking for max_fail_percentage 23826 1726867449.82568: done checking for max_fail_percentage 23826 1726867449.82569: checking to see if all hosts have failed and the running result is not ok 23826 1726867449.82570: done checking to see if all hosts have failed 23826 1726867449.82571: getting the remaining hosts for this loop 23826 1726867449.82572: done getting the remaining hosts for this loop 23826 1726867449.82575: getting the next task for host managed_node2 23826 1726867449.82583: done getting next task for host managed_node2 23826 1726867449.82587: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 23826 1726867449.82589: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867449.82602: getting variables 23826 1726867449.82604: in VariableManager get_vars() 23826 1726867449.82638: Calling all_inventory to load vars for managed_node2 23826 1726867449.82640: Calling groups_inventory to load vars for managed_node2 23826 1726867449.82643: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867449.82653: Calling all_plugins_play to load vars for managed_node2 23826 1726867449.82656: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867449.82659: Calling groups_plugins_play to load vars for managed_node2 23826 1726867449.84011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867449.85687: done with get_vars() 23826 1726867449.85708: done getting variables 23826 1726867449.85764: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:24:09 -0400 (0:00:00.049) 0:00:31.869 ****** 23826 1726867449.85796: entering _queue_task() for managed_node2/fail 23826 1726867449.86069: worker is 1 (out of 1 available) 23826 1726867449.86084: exiting _queue_task() for managed_node2/fail 23826 1726867449.86096: done queuing things up, now waiting for results queue to drain 23826 1726867449.86098: waiting for pending results... 
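Annotation: the teaming guard queued above (tasks/main.yml:25) is another ansible.builtin.fail task; the trace that follows evaluates three conditions: a major version above 9, a distribution in __network_rh_distros, and a selectattr scan of network_connections and network_state.interfaces for type "team". A sketch of how such a task could look, with only the message wording assumed:

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team connections are not supported on EL10 or later   # wording is hypothetical
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    - >-
      network_connections | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0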
23826 1726867449.86497: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 23826 1726867449.86502: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000088 23826 1726867449.86508: variable 'ansible_search_path' from source: unknown 23826 1726867449.86517: variable 'ansible_search_path' from source: unknown 23826 1726867449.86558: calling self._execute() 23826 1726867449.86661: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867449.86675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867449.86693: variable 'omit' from source: magic vars 23826 1726867449.87068: variable 'ansible_distribution_major_version' from source: facts 23826 1726867449.87087: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867449.87264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867449.89450: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867449.89531: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867449.89576: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867449.89619: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867449.89656: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867449.89735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867449.89773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867449.89806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867449.89851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867449.89876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867449.90084: variable 'ansible_distribution_major_version' from source: facts 23826 1726867449.90087: Evaluated conditional (ansible_distribution_major_version | int > 9): True 23826 1726867449.90114: variable 'ansible_distribution' from source: facts 23826 1726867449.90124: variable '__network_rh_distros' from source: role '' defaults 23826 1726867449.90139: Evaluated conditional (ansible_distribution in __network_rh_distros): True 23826 1726867449.90394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867449.90427: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867449.90456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867449.90502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867449.90527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867449.90581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867449.90611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867449.90645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867449.90691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867449.90711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867449.90761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867449.90792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867449.90821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867449.90869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867449.90951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867449.91229: variable 'network_connections' from source: play vars 23826 1726867449.91245: variable 'profile' from source: play vars 23826 1726867449.91321: variable 'profile' from source: play vars 23826 1726867449.91332: variable 'interface' from source: set_fact 23826 1726867449.91398: variable 'interface' from source: set_fact 23826 1726867449.91416: variable 'network_state' from source: role '' defaults 23826 
1726867449.91489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867449.91679: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867449.91724: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867449.91783: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867449.91793: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867449.91842: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867449.91932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867449.91935: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867449.91940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867449.91970: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 23826 1726867449.91981: when evaluation is False, skipping this task 23826 1726867449.91989: _execute() done 23826 1726867449.91996: dumping result to json 23826 1726867449.92004: done dumping result, returning 23826 1726867449.92015: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-a92d-a3ea-000000000088] 23826 1726867449.92025: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000088 skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 23826 1726867449.92326: no more pending results, returning what we have 23826 1726867449.92330: results queue empty 23826 1726867449.92331: checking for any_errors_fatal 23826 1726867449.92338: done checking for any_errors_fatal 23826 1726867449.92339: checking for max_fail_percentage 23826 1726867449.92341: done checking for max_fail_percentage 23826 1726867449.92342: checking to see if all hosts have failed and the running result is not ok 23826 1726867449.92343: done checking to see if all hosts have failed 23826 1726867449.92344: getting the remaining hosts for this loop 23826 1726867449.92346: done getting the remaining hosts for this loop 23826 1726867449.92349: getting the next task for host managed_node2 23826 1726867449.92355: done getting next task for host managed_node2 23826 1726867449.92359: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 23826 1726867449.92361: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867449.92374: getting variables 23826 1726867449.92375: in VariableManager get_vars() 23826 1726867449.92413: Calling all_inventory to load vars for managed_node2 23826 1726867449.92416: Calling groups_inventory to load vars for managed_node2 23826 1726867449.92418: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867449.92428: Calling all_plugins_play to load vars for managed_node2 23826 1726867449.92431: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867449.92433: Calling groups_plugins_play to load vars for managed_node2 23826 1726867449.92990: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000088 23826 1726867449.92994: WORKER PROCESS EXITING 23826 1726867449.93924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867449.95453: done with get_vars() 23826 1726867449.95479: done getting variables 23826 1726867449.95536: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:24:09 -0400 (0:00:00.097) 0:00:31.967 ****** 23826 1726867449.95568: entering _queue_task() for managed_node2/dnf 23826 1726867449.95843: worker is 1 (out of 1 available) 23826 1726867449.95857: exiting _queue_task() for managed_node2/dnf 23826 1726867449.95869: done queuing things up, now waiting for results queue to drain 23826 1726867449.95870: waiting for pending results... 
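Annotation: the task queued above (tasks/main.yml:36) runs the dnf action and is gated on __network_wireless_connections_defined or __network_team_connections_defined, which the trace below evaluates to False. A rough sketch, assuming it does a check-mode update of the role's network_packages list; only the module name and the when conditions are visible in this log:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"   # package list is an assumption
    state: latest                    # assumed, to match the "check if updates are available" intent
  check_mode: true                   # assumed
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined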
23826 1726867449.96057: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 23826 1726867449.96127: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000089 23826 1726867449.96138: variable 'ansible_search_path' from source: unknown 23826 1726867449.96143: variable 'ansible_search_path' from source: unknown 23826 1726867449.96170: calling self._execute() 23826 1726867449.96247: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867449.96251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867449.96260: variable 'omit' from source: magic vars 23826 1726867449.96536: variable 'ansible_distribution_major_version' from source: facts 23826 1726867449.96540: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867449.96674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867449.98596: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867449.99086: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867449.99127: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867449.99139: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867449.99162: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867449.99219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867449.99241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867449.99261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867449.99292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867449.99303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867449.99388: variable 'ansible_distribution' from source: facts 23826 1726867449.99392: variable 'ansible_distribution_major_version' from source: facts 23826 1726867449.99404: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 23826 1726867449.99481: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867449.99566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867449.99672: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867449.99675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867449.99679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867449.99682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867449.99686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867449.99689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867449.99696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867449.99724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867449.99734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867449.99761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867449.99782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867449.99799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867449.99828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867449.99838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867449.99938: variable 'network_connections' from source: play vars 23826 1726867449.99949: variable 'profile' from source: play vars 23826 1726867449.99998: variable 'profile' from source: play vars 23826 1726867450.00002: variable 'interface' from source: set_fact 23826 1726867450.00048: variable 'interface' from source: set_fact 23826 1726867450.00101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 23826 1726867450.00226: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867450.00271: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867450.00383: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867450.00386: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867450.00398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867450.00427: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867450.00465: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.00501: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867450.00550: variable '__network_team_connections_defined' from source: role '' defaults 23826 1726867450.00786: variable 'network_connections' from source: play vars 23826 1726867450.00797: variable 'profile' from source: play vars 23826 1726867450.00904: variable 'profile' from source: play vars 23826 1726867450.00913: variable 'interface' from source: set_fact 23826 1726867450.00989: variable 'interface' from source: set_fact 23826 1726867450.01021: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 23826 1726867450.01038: when evaluation is False, skipping this task 23826 1726867450.01087: _execute() done 23826 1726867450.01090: dumping result to json 23826 1726867450.01092: done dumping result, returning 23826 1726867450.01094: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-a92d-a3ea-000000000089] 23826 1726867450.01096: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000089 23826 1726867450.01234: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000089 23826 1726867450.01237: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 23826 1726867450.01293: no more pending results, returning what we have 23826 1726867450.01298: results queue empty 23826 1726867450.01299: checking for any_errors_fatal 23826 1726867450.01305: done checking for any_errors_fatal 23826 1726867450.01305: checking for max_fail_percentage 23826 1726867450.01307: done checking for max_fail_percentage 23826 1726867450.01309: checking to see if all hosts have failed and the running result is not ok 23826 1726867450.01310: done checking to see if all hosts have failed 23826 1726867450.01310: getting the remaining hosts for this loop 23826 1726867450.01312: done getting the remaining hosts for this loop 23826 
1726867450.01316: getting the next task for host managed_node2 23826 1726867450.01323: done getting next task for host managed_node2 23826 1726867450.01326: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 23826 1726867450.01329: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867450.01342: getting variables 23826 1726867450.01343: in VariableManager get_vars() 23826 1726867450.01384: Calling all_inventory to load vars for managed_node2 23826 1726867450.01388: Calling groups_inventory to load vars for managed_node2 23826 1726867450.01390: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867450.01401: Calling all_plugins_play to load vars for managed_node2 23826 1726867450.01404: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867450.01406: Calling groups_plugins_play to load vars for managed_node2 23826 1726867450.03348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867450.05425: done with get_vars() 23826 1726867450.05448: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 23826 1726867450.05528: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:24:10 -0400 (0:00:00.099) 0:00:32.066 ****** 23826 1726867450.05558: entering _queue_task() for managed_node2/yum 23826 1726867450.05858: worker is 1 (out of 1 available) 23826 1726867450.05868: exiting _queue_task() for managed_node2/yum 23826 1726867450.05882: done queuing things up, now waiting for results queue to drain 23826 1726867450.05883: waiting for pending results... 
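Annotation: as the "redirecting (type: action)" message above shows, ansible.builtin.yum resolves to the dnf action plugin on this ansible-core 2.17 controller; the task itself (tasks/main.yml:48) applies only to hosts below EL8, which is why the ansible_distribution_major_version | int < 8 test below skips it. A sketch mirroring the DNF variant, with the package list, check_mode, and the second condition assumed rather than taken from this log:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:               # redirected to the dnf action plugin on ansible-core 2.17
    name: "{{ network_packages }}"   # assumed, mirroring the DNF variant above
    state: latest                    # assumed
  check_mode: true                   # assumed
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined   # assumed; not evaluated here because the version test is already false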
23826 1726867450.06211: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 23826 1726867450.06284: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000008a 23826 1726867450.06305: variable 'ansible_search_path' from source: unknown 23826 1726867450.06308: variable 'ansible_search_path' from source: unknown 23826 1726867450.06340: calling self._execute() 23826 1726867450.06420: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867450.06424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867450.06434: variable 'omit' from source: magic vars 23826 1726867450.06784: variable 'ansible_distribution_major_version' from source: facts 23826 1726867450.06787: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867450.06916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867450.10016: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867450.10093: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867450.10137: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867450.10176: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867450.10382: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867450.10385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867450.10388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867450.10390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.10393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867450.10411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867450.10505: variable 'ansible_distribution_major_version' from source: facts 23826 1726867450.10529: Evaluated conditional (ansible_distribution_major_version | int < 8): False 23826 1726867450.10536: when evaluation is False, skipping this task 23826 1726867450.10543: _execute() done 23826 1726867450.10549: dumping result to json 23826 1726867450.10556: done dumping result, returning 23826 1726867450.10566: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-a92d-a3ea-00000000008a] 23826 
1726867450.10576: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000008a 23826 1726867450.10685: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000008a 23826 1726867450.10692: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 23826 1726867450.10743: no more pending results, returning what we have 23826 1726867450.10746: results queue empty 23826 1726867450.10747: checking for any_errors_fatal 23826 1726867450.10753: done checking for any_errors_fatal 23826 1726867450.10754: checking for max_fail_percentage 23826 1726867450.10756: done checking for max_fail_percentage 23826 1726867450.10757: checking to see if all hosts have failed and the running result is not ok 23826 1726867450.10757: done checking to see if all hosts have failed 23826 1726867450.10758: getting the remaining hosts for this loop 23826 1726867450.10759: done getting the remaining hosts for this loop 23826 1726867450.10763: getting the next task for host managed_node2 23826 1726867450.10768: done getting next task for host managed_node2 23826 1726867450.10772: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 23826 1726867450.10774: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867450.10788: getting variables 23826 1726867450.10789: in VariableManager get_vars() 23826 1726867450.10831: Calling all_inventory to load vars for managed_node2 23826 1726867450.10834: Calling groups_inventory to load vars for managed_node2 23826 1726867450.10836: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867450.10845: Calling all_plugins_play to load vars for managed_node2 23826 1726867450.10848: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867450.10850: Calling groups_plugins_play to load vars for managed_node2 23826 1726867450.12356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867450.13955: done with get_vars() 23826 1726867450.13976: done getting variables 23826 1726867450.14038: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:24:10 -0400 (0:00:00.085) 0:00:32.152 ****** 23826 1726867450.14070: entering _queue_task() for managed_node2/fail 23826 1726867450.14359: worker is 1 (out of 1 available) 23826 1726867450.14370: exiting _queue_task() for managed_node2/fail 23826 1726867450.14484: done queuing things up, now waiting for results queue to drain 23826 1726867450.14486: waiting for pending results... 
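Annotation: the task queued above (tasks/main.yml:60) is a fail prompt gated on the same wireless/team conditions; only the module and the conditional appear in the log, so the message wording in this sketch is hypothetical:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-                          # wording is hypothetical
      Managing wireless or team connections requires restarting NetworkManager;
      confirm the restart before re-running the role.
  when: __network_wireless_connections_defined or __network_team_connections_defined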
23826 1726867450.14667: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 23826 1726867450.14780: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000008b 23826 1726867450.14799: variable 'ansible_search_path' from source: unknown 23826 1726867450.14818: variable 'ansible_search_path' from source: unknown 23826 1726867450.14851: calling self._execute() 23826 1726867450.14983: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867450.14987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867450.14989: variable 'omit' from source: magic vars 23826 1726867450.15359: variable 'ansible_distribution_major_version' from source: facts 23826 1726867450.15581: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867450.15585: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867450.15700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867450.18141: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867450.18218: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867450.18269: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867450.18317: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867450.18351: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867450.18443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867450.18480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867450.18516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.18567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867450.18590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867450.18650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867450.18682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867450.18715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.18764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867450.18787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867450.18834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867450.18961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867450.18964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.18967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867450.18969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867450.19140: variable 'network_connections' from source: play vars 23826 1726867450.19157: variable 'profile' from source: play vars 23826 1726867450.19238: variable 'profile' from source: play vars 23826 1726867450.19247: variable 'interface' from source: set_fact 23826 1726867450.19316: variable 'interface' from source: set_fact 23826 1726867450.19398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867450.19570: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867450.19619: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867450.19654: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867450.19700: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867450.19751: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867450.19779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867450.19810: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.19842: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867450.19935: 
variable '__network_team_connections_defined' from source: role '' defaults 23826 1726867450.20152: variable 'network_connections' from source: play vars 23826 1726867450.20162: variable 'profile' from source: play vars 23826 1726867450.20224: variable 'profile' from source: play vars 23826 1726867450.20231: variable 'interface' from source: set_fact 23826 1726867450.20293: variable 'interface' from source: set_fact 23826 1726867450.20324: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 23826 1726867450.20332: when evaluation is False, skipping this task 23826 1726867450.20369: _execute() done 23826 1726867450.20372: dumping result to json 23826 1726867450.20374: done dumping result, returning 23826 1726867450.20376: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-a92d-a3ea-00000000008b] 23826 1726867450.20387: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000008b skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 23826 1726867450.20531: no more pending results, returning what we have 23826 1726867450.20534: results queue empty 23826 1726867450.20535: checking for any_errors_fatal 23826 1726867450.20540: done checking for any_errors_fatal 23826 1726867450.20541: checking for max_fail_percentage 23826 1726867450.20542: done checking for max_fail_percentage 23826 1726867450.20543: checking to see if all hosts have failed and the running result is not ok 23826 1726867450.20544: done checking to see if all hosts have failed 23826 1726867450.20545: getting the remaining hosts for this loop 23826 1726867450.20547: done getting the remaining hosts for this loop 23826 1726867450.20550: getting the next task for host managed_node2 23826 1726867450.20556: done getting next task for host managed_node2 23826 1726867450.20560: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 23826 1726867450.20562: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867450.20575: getting variables 23826 1726867450.20579: in VariableManager get_vars() 23826 1726867450.20622: Calling all_inventory to load vars for managed_node2 23826 1726867450.20625: Calling groups_inventory to load vars for managed_node2 23826 1726867450.20628: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867450.20639: Calling all_plugins_play to load vars for managed_node2 23826 1726867450.20642: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867450.20644: Calling groups_plugins_play to load vars for managed_node2 23826 1726867450.21842: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000008b 23826 1726867450.21845: WORKER PROCESS EXITING 23826 1726867450.22476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867450.24076: done with get_vars() 23826 1726867450.24101: done getting variables 23826 1726867450.24164: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:24:10 -0400 (0:00:00.101) 0:00:32.253 ****** 23826 1726867450.24200: entering _queue_task() for managed_node2/package 23826 1726867450.24616: worker is 1 (out of 1 available) 23826 1726867450.24627: exiting _queue_task() for managed_node2/package 23826 1726867450.24638: done queuing things up, now waiting for results queue to drain 23826 1726867450.24639: waiting for pending results... 
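Annotation: the Install packages task (tasks/main.yml:73) uses the generic package action and, as the evaluation below shows, only runs when something in network_packages is missing from the gathered package facts. A sketch with the state value assumed:

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present                   # state value is an assumption
  when: not network_packages is subset(ansible_facts.packages.keys())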
23826 1726867450.24843: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 23826 1726867450.24964: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000008c 23826 1726867450.24990: variable 'ansible_search_path' from source: unknown 23826 1726867450.24998: variable 'ansible_search_path' from source: unknown 23826 1726867450.25043: calling self._execute() 23826 1726867450.25150: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867450.25163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867450.25181: variable 'omit' from source: magic vars 23826 1726867450.25553: variable 'ansible_distribution_major_version' from source: facts 23826 1726867450.25569: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867450.25766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867450.26045: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867450.26102: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867450.26143: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867450.26232: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867450.26347: variable 'network_packages' from source: role '' defaults 23826 1726867450.26463: variable '__network_provider_setup' from source: role '' defaults 23826 1726867450.26482: variable '__network_service_name_default_nm' from source: role '' defaults 23826 1726867450.26559: variable '__network_service_name_default_nm' from source: role '' defaults 23826 1726867450.26573: variable '__network_packages_default_nm' from source: role '' defaults 23826 1726867450.26645: variable '__network_packages_default_nm' from source: role '' defaults 23826 1726867450.26844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867450.28891: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867450.28954: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867450.28998: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867450.29035: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867450.29076: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867450.29366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867450.29475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867450.29482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.29493: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867450.29684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867450.29687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867450.29690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867450.29812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.29857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867450.29881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867450.30318: variable '__network_packages_default_gobject_packages' from source: role '' defaults 23826 1726867450.30596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867450.30627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867450.30651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.30689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867450.30720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867450.30813: variable 'ansible_python' from source: facts 23826 1726867450.30851: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 23826 1726867450.30948: variable '__network_wpa_supplicant_required' from source: role '' defaults 23826 1726867450.31028: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 23826 1726867450.31259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867450.31262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 23826 1726867450.31264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.31266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867450.31268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867450.31315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867450.31348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867450.31382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.31428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867450.31446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867450.31601: variable 'network_connections' from source: play vars 23826 1726867450.31615: variable 'profile' from source: play vars 23826 1726867450.31723: variable 'profile' from source: play vars 23826 1726867450.31735: variable 'interface' from source: set_fact 23826 1726867450.31815: variable 'interface' from source: set_fact 23826 1726867450.31888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867450.31927: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867450.31962: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.32000: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867450.32084: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867450.32425: variable 'network_connections' from source: play vars 23826 1726867450.32435: variable 'profile' from source: play vars 23826 1726867450.32544: variable 'profile' from source: play vars 23826 1726867450.32562: variable 'interface' from source: set_fact 23826 1726867450.32671: variable 'interface' from source: set_fact 23826 1726867450.32674: variable 
'__network_packages_default_wireless' from source: role '' defaults 23826 1726867450.32754: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867450.33081: variable 'network_connections' from source: play vars 23826 1726867450.33092: variable 'profile' from source: play vars 23826 1726867450.33153: variable 'profile' from source: play vars 23826 1726867450.33161: variable 'interface' from source: set_fact 23826 1726867450.33267: variable 'interface' from source: set_fact 23826 1726867450.33317: variable '__network_packages_default_team' from source: role '' defaults 23826 1726867450.33587: variable '__network_team_connections_defined' from source: role '' defaults 23826 1726867450.33905: variable 'network_connections' from source: play vars 23826 1726867450.33983: variable 'profile' from source: play vars 23826 1726867450.33990: variable 'profile' from source: play vars 23826 1726867450.33999: variable 'interface' from source: set_fact 23826 1726867450.34110: variable 'interface' from source: set_fact 23826 1726867450.34168: variable '__network_service_name_default_initscripts' from source: role '' defaults 23826 1726867450.34237: variable '__network_service_name_default_initscripts' from source: role '' defaults 23826 1726867450.34253: variable '__network_packages_default_initscripts' from source: role '' defaults 23826 1726867450.34319: variable '__network_packages_default_initscripts' from source: role '' defaults 23826 1726867450.34757: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 23826 1726867450.35667: variable 'network_connections' from source: play vars 23826 1726867450.35741: variable 'profile' from source: play vars 23826 1726867450.35859: variable 'profile' from source: play vars 23826 1726867450.35867: variable 'interface' from source: set_fact 23826 1726867450.36059: variable 'interface' from source: set_fact 23826 1726867450.36063: variable 'ansible_distribution' from source: facts 23826 1726867450.36069: variable '__network_rh_distros' from source: role '' defaults 23826 1726867450.36084: variable 'ansible_distribution_major_version' from source: facts 23826 1726867450.36104: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 23826 1726867450.36433: variable 'ansible_distribution' from source: facts 23826 1726867450.36494: variable '__network_rh_distros' from source: role '' defaults 23826 1726867450.36506: variable 'ansible_distribution_major_version' from source: facts 23826 1726867450.36580: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 23826 1726867450.36927: variable 'ansible_distribution' from source: facts 23826 1726867450.37015: variable '__network_rh_distros' from source: role '' defaults 23826 1726867450.37027: variable 'ansible_distribution_major_version' from source: facts 23826 1726867450.37071: variable 'network_provider' from source: set_fact 23826 1726867450.37113: variable 'ansible_facts' from source: unknown 23826 1726867450.38975: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 23826 1726867450.38988: when evaluation is False, skipping this task 23826 1726867450.38999: _execute() done 23826 1726867450.39005: dumping result to json 23826 1726867450.39016: done dumping result, returning 23826 1726867450.39029: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 
[0affcac9-a3a5-a92d-a3ea-00000000008c] 23826 1726867450.39039: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000008c 23826 1726867450.39329: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000008c 23826 1726867450.39332: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 23826 1726867450.39383: no more pending results, returning what we have 23826 1726867450.39386: results queue empty 23826 1726867450.39387: checking for any_errors_fatal 23826 1726867450.39396: done checking for any_errors_fatal 23826 1726867450.39397: checking for max_fail_percentage 23826 1726867450.39399: done checking for max_fail_percentage 23826 1726867450.39400: checking to see if all hosts have failed and the running result is not ok 23826 1726867450.39401: done checking to see if all hosts have failed 23826 1726867450.39401: getting the remaining hosts for this loop 23826 1726867450.39403: done getting the remaining hosts for this loop 23826 1726867450.39410: getting the next task for host managed_node2 23826 1726867450.39416: done getting next task for host managed_node2 23826 1726867450.39420: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 23826 1726867450.39422: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867450.39436: getting variables 23826 1726867450.39438: in VariableManager get_vars() 23826 1726867450.39475: Calling all_inventory to load vars for managed_node2 23826 1726867450.39480: Calling groups_inventory to load vars for managed_node2 23826 1726867450.39483: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867450.39496: Calling all_plugins_play to load vars for managed_node2 23826 1726867450.39499: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867450.39501: Calling groups_plugins_play to load vars for managed_node2 23826 1726867450.42074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867450.44076: done with get_vars() 23826 1726867450.44104: done getting variables 23826 1726867450.44169: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:24:10 -0400 (0:00:00.200) 0:00:32.453 ****** 23826 1726867450.44205: entering _queue_task() for managed_node2/package 23826 1726867450.44565: worker is 1 (out of 1 available) 23826 1726867450.44576: exiting _queue_task() for managed_node2/package 23826 1726867450.44791: done queuing things up, now waiting for results queue to drain 23826 1726867450.44792: waiting for pending results... 
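The skip recorded above reduces to Ansible's builtin subset test (one of the test plugins loaded a few lines earlier): the install task only runs if some package in network_packages is missing from the gathered package facts. A minimal Python sketch of that check, with assumed example values standing in for the role defaults and the package-fact keys:

    # Assumed example values; the real ones come from role defaults and package facts.
    network_packages = {"NetworkManager"}
    installed_packages = {"NetworkManager", "openssh-server", "python3"}

    # "not network_packages is subset(ansible_facts.packages.keys())"
    needs_install = not network_packages.issubset(installed_packages)
    print(needs_install)  # False -> the task is skipped, matching the result above
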
23826 1726867450.44880: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 23826 1726867450.45001: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000008d 23826 1726867450.45029: variable 'ansible_search_path' from source: unknown 23826 1726867450.45037: variable 'ansible_search_path' from source: unknown 23826 1726867450.45075: calling self._execute() 23826 1726867450.45184: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867450.45198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867450.45216: variable 'omit' from source: magic vars 23826 1726867450.45669: variable 'ansible_distribution_major_version' from source: facts 23826 1726867450.45672: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867450.45748: variable 'network_state' from source: role '' defaults 23826 1726867450.45764: Evaluated conditional (network_state != {}): False 23826 1726867450.45774: when evaluation is False, skipping this task 23826 1726867450.45786: _execute() done 23826 1726867450.45794: dumping result to json 23826 1726867450.45802: done dumping result, returning 23826 1726867450.45817: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-a92d-a3ea-00000000008d] 23826 1726867450.45827: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000008d skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 23826 1726867450.46073: no more pending results, returning what we have 23826 1726867450.46078: results queue empty 23826 1726867450.46080: checking for any_errors_fatal 23826 1726867450.46086: done checking for any_errors_fatal 23826 1726867450.46087: checking for max_fail_percentage 23826 1726867450.46089: done checking for max_fail_percentage 23826 1726867450.46090: checking to see if all hosts have failed and the running result is not ok 23826 1726867450.46091: done checking to see if all hosts have failed 23826 1726867450.46092: getting the remaining hosts for this loop 23826 1726867450.46093: done getting the remaining hosts for this loop 23826 1726867450.46097: getting the next task for host managed_node2 23826 1726867450.46103: done getting next task for host managed_node2 23826 1726867450.46110: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 23826 1726867450.46112: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867450.46129: getting variables 23826 1726867450.46130: in VariableManager get_vars() 23826 1726867450.46166: Calling all_inventory to load vars for managed_node2 23826 1726867450.46169: Calling groups_inventory to load vars for managed_node2 23826 1726867450.46172: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867450.46185: Calling all_plugins_play to load vars for managed_node2 23826 1726867450.46189: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867450.46192: Calling groups_plugins_play to load vars for managed_node2 23826 1726867450.46791: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000008d 23826 1726867450.46794: WORKER PROCESS EXITING 23826 1726867450.47730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867450.49546: done with get_vars() 23826 1726867450.49571: done getting variables 23826 1726867450.49633: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:24:10 -0400 (0:00:00.054) 0:00:32.507 ****** 23826 1726867450.49665: entering _queue_task() for managed_node2/package 23826 1726867450.50437: worker is 1 (out of 1 available) 23826 1726867450.50450: exiting _queue_task() for managed_node2/package 23826 1726867450.50464: done queuing things up, now waiting for results queue to drain 23826 1726867450.50465: waiting for pending results... 
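The task that was just skipped, and the python3-libnmstate task queued next, are gated on the same comparison: in this run network_state is the role default, an empty mapping, so network_state != {} evaluates to False. A trivial sketch of that evaluation, assuming the default value the log attributes to "role '' defaults":

    # network_state defaults to an empty mapping in this run (assumed from the log).
    network_state = {}

    run_task = network_state != {}   # the when: condition on both install tasks
    print(run_task)                  # False -> "when evaluation is False, skipping this task"
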
23826 1726867450.51121: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 23826 1726867450.51315: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000008e 23826 1726867450.51335: variable 'ansible_search_path' from source: unknown 23826 1726867450.51339: variable 'ansible_search_path' from source: unknown 23826 1726867450.51376: calling self._execute() 23826 1726867450.51692: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867450.51699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867450.51713: variable 'omit' from source: magic vars 23826 1726867450.52468: variable 'ansible_distribution_major_version' from source: facts 23826 1726867450.52481: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867450.52773: variable 'network_state' from source: role '' defaults 23826 1726867450.52788: Evaluated conditional (network_state != {}): False 23826 1726867450.52791: when evaluation is False, skipping this task 23826 1726867450.52794: _execute() done 23826 1726867450.52797: dumping result to json 23826 1726867450.52887: done dumping result, returning 23826 1726867450.52896: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-a92d-a3ea-00000000008e] 23826 1726867450.52901: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000008e 23826 1726867450.53064: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000008e 23826 1726867450.53067: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 23826 1726867450.53132: no more pending results, returning what we have 23826 1726867450.53136: results queue empty 23826 1726867450.53137: checking for any_errors_fatal 23826 1726867450.53144: done checking for any_errors_fatal 23826 1726867450.53145: checking for max_fail_percentage 23826 1726867450.53147: done checking for max_fail_percentage 23826 1726867450.53147: checking to see if all hosts have failed and the running result is not ok 23826 1726867450.53148: done checking to see if all hosts have failed 23826 1726867450.53149: getting the remaining hosts for this loop 23826 1726867450.53151: done getting the remaining hosts for this loop 23826 1726867450.53154: getting the next task for host managed_node2 23826 1726867450.53160: done getting next task for host managed_node2 23826 1726867450.53164: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 23826 1726867450.53166: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867450.53181: getting variables 23826 1726867450.53183: in VariableManager get_vars() 23826 1726867450.53224: Calling all_inventory to load vars for managed_node2 23826 1726867450.53227: Calling groups_inventory to load vars for managed_node2 23826 1726867450.53229: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867450.53241: Calling all_plugins_play to load vars for managed_node2 23826 1726867450.53245: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867450.53248: Calling groups_plugins_play to load vars for managed_node2 23826 1726867450.56598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867450.58389: done with get_vars() 23826 1726867450.58415: done getting variables 23826 1726867450.58479: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:24:10 -0400 (0:00:00.088) 0:00:32.596 ****** 23826 1726867450.58513: entering _queue_task() for managed_node2/service 23826 1726867450.59784: worker is 1 (out of 1 available) 23826 1726867450.59796: exiting _queue_task() for managed_node2/service 23826 1726867450.59811: done queuing things up, now waiting for results queue to drain 23826 1726867450.59813: waiting for pending results... 
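Each debug line is prefixed with the controller process id (23826) and a Unix timestamp; the human-readable stamps in the TASK banners are the same clock rendered in the controller's -0400 local time. A quick standard-library check of that conversion:

    from datetime import datetime, timezone, timedelta

    epoch = 1726867450                           # integer part of the log-line prefix
    local = timezone(timedelta(hours=-4))        # the -0400 offset in the TASK banners
    print(datetime.fromtimestamp(epoch, local))  # 2024-09-20 17:24:10-04:00
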
23826 1726867450.60527: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 23826 1726867450.60620: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000008f 23826 1726867450.60624: variable 'ansible_search_path' from source: unknown 23826 1726867450.60626: variable 'ansible_search_path' from source: unknown 23826 1726867450.60726: calling self._execute() 23826 1726867450.61224: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867450.61227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867450.61231: variable 'omit' from source: magic vars 23826 1726867450.61885: variable 'ansible_distribution_major_version' from source: facts 23826 1726867450.61905: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867450.62079: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867450.62576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867450.66970: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867450.67161: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867450.67413: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867450.67417: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867450.67419: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867450.67684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867450.67687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867450.67701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.67952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867450.67955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867450.67957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867450.67959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867450.67983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 23826 1726867450.68103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867450.68127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867450.68215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867450.68310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867450.68340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.68429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867450.68505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867450.68697: variable 'network_connections' from source: play vars 23826 1726867450.68721: variable 'profile' from source: play vars 23826 1726867450.68916: variable 'profile' from source: play vars 23826 1726867450.68965: variable 'interface' from source: set_fact 23826 1726867450.69162: variable 'interface' from source: set_fact 23826 1726867450.69244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867450.69429: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867450.69476: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867450.69581: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867450.69585: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867450.69604: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867450.69634: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867450.69665: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.69702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867450.69756: variable '__network_team_connections_defined' from source: role '' defaults 23826 
1726867450.70017: variable 'network_connections' from source: play vars 23826 1726867450.70028: variable 'profile' from source: play vars 23826 1726867450.70093: variable 'profile' from source: play vars 23826 1726867450.70102: variable 'interface' from source: set_fact 23826 1726867450.70173: variable 'interface' from source: set_fact 23826 1726867450.70229: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 23826 1726867450.70232: when evaluation is False, skipping this task 23826 1726867450.70234: _execute() done 23826 1726867450.70236: dumping result to json 23826 1726867450.70238: done dumping result, returning 23826 1726867450.70240: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-a92d-a3ea-00000000008f] 23826 1726867450.70283: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000008f skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 23826 1726867450.70532: no more pending results, returning what we have 23826 1726867450.70536: results queue empty 23826 1726867450.70537: checking for any_errors_fatal 23826 1726867450.70544: done checking for any_errors_fatal 23826 1726867450.70545: checking for max_fail_percentage 23826 1726867450.70548: done checking for max_fail_percentage 23826 1726867450.70549: checking to see if all hosts have failed and the running result is not ok 23826 1726867450.70550: done checking to see if all hosts have failed 23826 1726867450.70551: getting the remaining hosts for this loop 23826 1726867450.70552: done getting the remaining hosts for this loop 23826 1726867450.70556: getting the next task for host managed_node2 23826 1726867450.70563: done getting next task for host managed_node2 23826 1726867450.70566: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 23826 1726867450.70569: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867450.70585: getting variables 23826 1726867450.70587: in VariableManager get_vars() 23826 1726867450.70634: Calling all_inventory to load vars for managed_node2 23826 1726867450.70637: Calling groups_inventory to load vars for managed_node2 23826 1726867450.70640: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867450.70651: Calling all_plugins_play to load vars for managed_node2 23826 1726867450.70654: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867450.70657: Calling groups_plugins_play to load vars for managed_node2 23826 1726867450.71191: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000008f 23826 1726867450.71194: WORKER PROCESS EXITING 23826 1726867450.72305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867450.73876: done with get_vars() 23826 1726867450.73902: done getting variables 23826 1726867450.73962: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:24:10 -0400 (0:00:00.154) 0:00:32.751 ****** 23826 1726867450.73993: entering _queue_task() for managed_node2/service 23826 1726867450.74390: worker is 1 (out of 1 available) 23826 1726867450.74400: exiting _queue_task() for managed_node2/service 23826 1726867450.74412: done queuing things up, now waiting for results queue to drain 23826 1726867450.74413: waiting for pending results... 
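The restart task above is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for this run; those role facts flag whether any requested connection is a wireless or team interface. A rough sketch of that decision, with an assumed single-ethernet network_connections value (the real derivation lives in the role, not here):

    # Assumed example: one plain ethernet profile, no wireless or team connections.
    network_connections = [{"name": "example-profile", "type": "ethernet"}]

    wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
    team_defined = any(c.get("type") == "team" for c in network_connections)

    print(wireless_defined or team_defined)  # False -> "Conditional result was False"
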
23826 1726867450.74750: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 23826 1726867450.75206: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000090 23826 1726867450.75212: variable 'ansible_search_path' from source: unknown 23826 1726867450.75215: variable 'ansible_search_path' from source: unknown 23826 1726867450.75217: calling self._execute() 23826 1726867450.75400: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867450.75418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867450.75484: variable 'omit' from source: magic vars 23826 1726867450.75938: variable 'ansible_distribution_major_version' from source: facts 23826 1726867450.75954: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867450.76127: variable 'network_provider' from source: set_fact 23826 1726867450.76136: variable 'network_state' from source: role '' defaults 23826 1726867450.76148: Evaluated conditional (network_provider == "nm" or network_state != {}): True 23826 1726867450.76156: variable 'omit' from source: magic vars 23826 1726867450.76191: variable 'omit' from source: magic vars 23826 1726867450.76230: variable 'network_service_name' from source: role '' defaults 23826 1726867450.76304: variable 'network_service_name' from source: role '' defaults 23826 1726867450.76424: variable '__network_provider_setup' from source: role '' defaults 23826 1726867450.76482: variable '__network_service_name_default_nm' from source: role '' defaults 23826 1726867450.76494: variable '__network_service_name_default_nm' from source: role '' defaults 23826 1726867450.76505: variable '__network_packages_default_nm' from source: role '' defaults 23826 1726867450.76575: variable '__network_packages_default_nm' from source: role '' defaults 23826 1726867450.76800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867450.79890: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867450.79962: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867450.80023: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867450.80061: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867450.80116: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867450.80181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867450.80222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867450.80285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.80301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 23826 1726867450.80326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867450.80375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867450.80411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867450.80438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.80475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867450.80496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867450.80724: variable '__network_packages_default_gobject_packages' from source: role '' defaults 23826 1726867450.80860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867450.80950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867450.80953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.81041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867450.81067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867450.81158: variable 'ansible_python' from source: facts 23826 1726867450.81190: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 23826 1726867450.81280: variable '__network_wpa_supplicant_required' from source: role '' defaults 23826 1726867450.81362: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 23826 1726867450.81582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867450.81586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867450.81588: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.81603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867450.81626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867450.81679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867450.81722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867450.81752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.81798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867450.81827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867450.81972: variable 'network_connections' from source: play vars 23826 1726867450.81990: variable 'profile' from source: play vars 23826 1726867450.82072: variable 'profile' from source: play vars 23826 1726867450.82086: variable 'interface' from source: set_fact 23826 1726867450.82155: variable 'interface' from source: set_fact 23826 1726867450.82270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867450.82580: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867450.82583: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867450.82586: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867450.82633: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867450.82704: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867450.82742: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867450.82781: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867450.82825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
(found_in_cache=True, class_only=False) 23826 1726867450.82879: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867450.83181: variable 'network_connections' from source: play vars 23826 1726867450.83194: variable 'profile' from source: play vars 23826 1726867450.83284: variable 'profile' from source: play vars 23826 1726867450.83296: variable 'interface' from source: set_fact 23826 1726867450.83364: variable 'interface' from source: set_fact 23826 1726867450.83409: variable '__network_packages_default_wireless' from source: role '' defaults 23826 1726867450.83496: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867450.83814: variable 'network_connections' from source: play vars 23826 1726867450.83881: variable 'profile' from source: play vars 23826 1726867450.83902: variable 'profile' from source: play vars 23826 1726867450.83918: variable 'interface' from source: set_fact 23826 1726867450.83994: variable 'interface' from source: set_fact 23826 1726867450.84028: variable '__network_packages_default_team' from source: role '' defaults 23826 1726867450.84119: variable '__network_team_connections_defined' from source: role '' defaults 23826 1726867450.84422: variable 'network_connections' from source: play vars 23826 1726867450.84433: variable 'profile' from source: play vars 23826 1726867450.84510: variable 'profile' from source: play vars 23826 1726867450.84585: variable 'interface' from source: set_fact 23826 1726867450.84604: variable 'interface' from source: set_fact 23826 1726867450.84665: variable '__network_service_name_default_initscripts' from source: role '' defaults 23826 1726867450.84735: variable '__network_service_name_default_initscripts' from source: role '' defaults 23826 1726867450.84748: variable '__network_packages_default_initscripts' from source: role '' defaults 23826 1726867450.84820: variable '__network_packages_default_initscripts' from source: role '' defaults 23826 1726867450.85055: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 23826 1726867450.85784: variable 'network_connections' from source: play vars 23826 1726867450.85788: variable 'profile' from source: play vars 23826 1726867450.85790: variable 'profile' from source: play vars 23826 1726867450.85792: variable 'interface' from source: set_fact 23826 1726867450.85794: variable 'interface' from source: set_fact 23826 1726867450.85796: variable 'ansible_distribution' from source: facts 23826 1726867450.85798: variable '__network_rh_distros' from source: role '' defaults 23826 1726867450.85799: variable 'ansible_distribution_major_version' from source: facts 23826 1726867450.85801: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 23826 1726867450.85944: variable 'ansible_distribution' from source: facts 23826 1726867450.85953: variable '__network_rh_distros' from source: role '' defaults 23826 1726867450.85963: variable 'ansible_distribution_major_version' from source: facts 23826 1726867450.85984: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 23826 1726867450.86160: variable 'ansible_distribution' from source: facts 23826 1726867450.86169: variable '__network_rh_distros' from source: role '' defaults 23826 1726867450.86176: variable 'ansible_distribution_major_version' from source: facts 23826 1726867450.86219: variable 'network_provider' from source: set_fact 23826 1726867450.86251: variable 
'omit' from source: magic vars 23826 1726867450.86283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867450.86316: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867450.86339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867450.86363: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867450.86376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867450.86413: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867450.86421: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867450.86428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867450.86536: Set connection var ansible_timeout to 10 23826 1726867450.86548: Set connection var ansible_shell_executable to /bin/sh 23826 1726867450.86554: Set connection var ansible_connection to ssh 23826 1726867450.86567: Set connection var ansible_pipelining to False 23826 1726867450.86573: Set connection var ansible_shell_type to sh 23826 1726867450.86583: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867450.86612: variable 'ansible_shell_executable' from source: unknown 23826 1726867450.86620: variable 'ansible_connection' from source: unknown 23826 1726867450.86627: variable 'ansible_module_compression' from source: unknown 23826 1726867450.86633: variable 'ansible_shell_type' from source: unknown 23826 1726867450.86639: variable 'ansible_shell_executable' from source: unknown 23826 1726867450.86645: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867450.86675: variable 'ansible_pipelining' from source: unknown 23826 1726867450.86679: variable 'ansible_timeout' from source: unknown 23826 1726867450.86681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867450.86781: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867450.86883: variable 'omit' from source: magic vars 23826 1726867450.86886: starting attempt loop 23826 1726867450.86888: running the handler 23826 1726867450.86982: variable 'ansible_facts' from source: unknown 23826 1726867450.87668: _low_level_execute_command(): starting 23826 1726867450.87684: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867450.88384: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867450.88402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867450.88425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867450.88533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867450.88698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867450.88762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867450.90856: stdout chunk (state=3): >>>/root <<< 23826 1726867450.90921: stdout chunk (state=3): >>><<< 23826 1726867450.90924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867450.90927: stderr chunk (state=3): >>><<< 23826 1726867450.90948: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867450.91167: _low_level_execute_command(): starting 23826 1726867450.91171: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867450.9108412-25446-96960541064782 `" && echo ansible-tmp-1726867450.9108412-25446-96960541064782="` echo /root/.ansible/tmp/ansible-tmp-1726867450.9108412-25446-96960541064782 `" ) && sleep 0' 23826 1726867450.92348: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867450.92352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867450.92355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867450.92359: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867450.92361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867450.92494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867450.92497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867450.92552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867450.94546: stdout chunk (state=3): >>>ansible-tmp-1726867450.9108412-25446-96960541064782=/root/.ansible/tmp/ansible-tmp-1726867450.9108412-25446-96960541064782 <<< 23826 1726867450.94652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867450.94690: stderr chunk (state=3): >>><<< 23826 1726867450.94862: stdout chunk (state=3): >>><<< 23826 1726867450.94865: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867450.9108412-25446-96960541064782=/root/.ansible/tmp/ansible-tmp-1726867450.9108412-25446-96960541064782 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867450.94868: variable 'ansible_module_compression' from source: unknown 23826 1726867450.95184: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 23826 1726867450.95186: variable 'ansible_facts' from source: unknown 23826 1726867450.95586: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867450.9108412-25446-96960541064782/AnsiballZ_systemd.py 23826 1726867450.95817: Sending initial data 23826 1726867450.95827: Sent initial data (155 bytes) 23826 1726867450.97033: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867450.97093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867450.97272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867450.97321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867450.97490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867450.99094: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 23826 1726867450.99097: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867450.99126: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867450.99195: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867450.9108412-25446-96960541064782/AnsiballZ_systemd.py" <<< 23826 1726867450.99198: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpf0pod9lp /root/.ansible/tmp/ansible-tmp-1726867450.9108412-25446-96960541064782/AnsiballZ_systemd.py <<< 23826 1726867450.99284: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpf0pod9lp" to remote "/root/.ansible/tmp/ansible-tmp-1726867450.9108412-25446-96960541064782/AnsiballZ_systemd.py" <<< 23826 1726867450.99287: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867450.9108412-25446-96960541064782/AnsiballZ_systemd.py" <<< 23826 1726867451.01957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867451.02083: stderr chunk (state=3): >>><<< 23826 1726867451.02087: stdout chunk (state=3): >>><<< 23826 1726867451.02089: done transferring module to remote 23826 1726867451.02091: _low_level_execute_command(): starting 23826 1726867451.02094: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867450.9108412-25446-96960541064782/ /root/.ansible/tmp/ansible-tmp-1726867450.9108412-25446-96960541064782/AnsiballZ_systemd.py && sleep 0' 23826 1726867451.02851: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867451.02992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867451.03001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867451.03005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867451.03036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867451.05017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867451.05030: stderr chunk (state=3): >>><<< 23826 1726867451.05039: stdout chunk (state=3): >>><<< 23826 1726867451.05060: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867451.05069: _low_level_execute_command(): starting 23826 1726867451.05085: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867450.9108412-25446-96960541064782/AnsiballZ_systemd.py && sleep 0' 23826 1726867451.05700: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867451.05717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867451.05735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867451.05792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867451.05860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867451.05882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867451.05905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867451.06011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867451.35541: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", 
"Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4546560", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314860032", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1102935000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 23826 1726867451.35570: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", 
"LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": 
"network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus<<< 23826 1726867451.35580: stdout chunk (state=3): >>>-broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ConditionTimestampMonotonic": "284275676", "AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 23826 1726867451.37567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 23826 1726867451.37574: stderr chunk (state=3): >>><<< 23826 1726867451.37576: stdout chunk (state=3): >>><<< 23826 1726867451.37606: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4546560", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314860032", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1102935000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ConditionTimestampMonotonic": "284275676", "AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 23826 1726867451.37734: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867450.9108412-25446-96960541064782/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867451.37749: _low_level_execute_command(): starting 23826 1726867451.37754: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867450.9108412-25446-96960541064782/ > /dev/null 2>&1 && sleep 0' 23826 1726867451.38164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867451.38167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867451.38169: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867451.38171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 23826 1726867451.38173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867451.38228: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867451.38231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867451.38268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867451.40114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867451.40127: stderr chunk (state=3): >>><<< 23826 1726867451.40130: stdout chunk (state=3): >>><<< 23826 1726867451.40141: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867451.40148: handler run complete 23826 1726867451.40188: attempt loop complete, returning result 23826 1726867451.40191: _execute() done 23826 1726867451.40194: dumping result to json 23826 1726867451.40209: done dumping result, returning 23826 1726867451.40219: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-a92d-a3ea-000000000090] 23826 1726867451.40227: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000090 23826 1726867451.40460: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000090 23826 1726867451.40462: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 23826 1726867451.40510: no more pending results, returning what we have 23826 1726867451.40513: results queue empty 23826 1726867451.40514: checking for any_errors_fatal 23826 1726867451.40520: done checking for any_errors_fatal 23826 1726867451.40520: checking for max_fail_percentage 23826 1726867451.40522: done checking for max_fail_percentage 23826 1726867451.40523: checking to see if all hosts have failed and the running result is not ok 23826 1726867451.40523: done checking to see if all hosts have failed 23826 1726867451.40524: getting the remaining hosts for this loop 23826 1726867451.40525: done getting the remaining hosts for this loop 23826 1726867451.40529: getting the next task for host managed_node2 23826 1726867451.40534: done getting next task for host managed_node2 23826 1726867451.40536: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 23826 1726867451.40538: ^ state is: HOST STATE: block=2, 
task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867451.40547: getting variables 23826 1726867451.40548: in VariableManager get_vars() 23826 1726867451.40582: Calling all_inventory to load vars for managed_node2 23826 1726867451.40584: Calling groups_inventory to load vars for managed_node2 23826 1726867451.40586: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867451.40595: Calling all_plugins_play to load vars for managed_node2 23826 1726867451.40597: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867451.40599: Calling groups_plugins_play to load vars for managed_node2 23826 1726867451.41493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867451.42360: done with get_vars() 23826 1726867451.42375: done getting variables 23826 1726867451.42423: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:24:11 -0400 (0:00:00.684) 0:00:33.435 ****** 23826 1726867451.42445: entering _queue_task() for managed_node2/service 23826 1726867451.42666: worker is 1 (out of 1 available) 23826 1726867451.42679: exiting _queue_task() for managed_node2/service 23826 1726867451.42690: done queuing things up, now waiting for results queue to drain 23826 1726867451.42691: waiting for pending results... 
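The sequence above is the standard Ansible module round trip over the persistent SSH connection: the AnsiballZ_systemd.py payload is uploaded to the remote temporary directory with sftp, marked executable with chmod u+x, executed with /usr/bin/python3.12, and the JSON it prints on stdout becomes the task result (censored in the playbook output because no_log: true is set for this task). The "status" mapping in that JSON is the unit's property list as systemd reports it. A minimal sketch of collecting an equivalent mapping on the managed host is below; it assumes a systemd host with systemctl on PATH and is illustrative only, not the systemd module's actual implementation.

    import subprocess

    def unit_properties(unit: str = "NetworkManager.service") -> dict[str, str]:
        # Parse `systemctl show <unit>` (KEY=VALUE per line) into the same kind
        # of property mapping seen in the "status" field of the result above.
        out = subprocess.run(
            ["systemctl", "show", unit],
            capture_output=True, text=True, check=True,
        ).stdout
        return dict(line.split("=", 1) for line in out.splitlines() if "=" in line)

    props = unit_properties()
    # In the run logged here: ActiveState=active, UnitFileState=enabled, MainPID=6928.
    print(props["ActiveState"], props["UnitFileState"], props["MainPID"])
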
23826 1726867451.42871: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 23826 1726867451.42947: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000091 23826 1726867451.42958: variable 'ansible_search_path' from source: unknown 23826 1726867451.42962: variable 'ansible_search_path' from source: unknown 23826 1726867451.42991: calling self._execute() 23826 1726867451.43064: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867451.43068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867451.43078: variable 'omit' from source: magic vars 23826 1726867451.43346: variable 'ansible_distribution_major_version' from source: facts 23826 1726867451.43357: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867451.43441: variable 'network_provider' from source: set_fact 23826 1726867451.43445: Evaluated conditional (network_provider == "nm"): True 23826 1726867451.43517: variable '__network_wpa_supplicant_required' from source: role '' defaults 23826 1726867451.43576: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 23826 1726867451.43700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867451.45121: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867451.45165: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867451.45193: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867451.45224: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867451.45243: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867451.45314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867451.45336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867451.45353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867451.45381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867451.45393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867451.45432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867451.45449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 23826 1726867451.45465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867451.45491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867451.45502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867451.45538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 23826 1726867451.45553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867451.45570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867451.45595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867451.45606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867451.45703: variable 'network_connections' from source: play vars 23826 1726867451.45716: variable 'profile' from source: play vars 23826 1726867451.45766: variable 'profile' from source: play vars 23826 1726867451.45769: variable 'interface' from source: set_fact 23826 1726867451.45814: variable 'interface' from source: set_fact 23826 1726867451.45865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 23826 1726867451.45973: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 23826 1726867451.46002: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 23826 1726867451.46028: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 23826 1726867451.46049: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 23826 1726867451.46083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 23826 1726867451.46099: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 23826 1726867451.46119: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867451.46135: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 23826 1726867451.46171: variable '__network_wireless_connections_defined' from source: role '' defaults 23826 1726867451.46332: variable 'network_connections' from source: play vars 23826 1726867451.46335: variable 'profile' from source: play vars 23826 1726867451.46378: variable 'profile' from source: play vars 23826 1726867451.46382: variable 'interface' from source: set_fact 23826 1726867451.46427: variable 'interface' from source: set_fact 23826 1726867451.46448: Evaluated conditional (__network_wpa_supplicant_required): False 23826 1726867451.46452: when evaluation is False, skipping this task 23826 1726867451.46455: _execute() done 23826 1726867451.46464: dumping result to json 23826 1726867451.46466: done dumping result, returning 23826 1726867451.46469: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-a92d-a3ea-000000000091] 23826 1726867451.46471: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000091 23826 1726867451.46549: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000091 23826 1726867451.46552: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 23826 1726867451.46601: no more pending results, returning what we have 23826 1726867451.46605: results queue empty 23826 1726867451.46606: checking for any_errors_fatal 23826 1726867451.46625: done checking for any_errors_fatal 23826 1726867451.46625: checking for max_fail_percentage 23826 1726867451.46627: done checking for max_fail_percentage 23826 1726867451.46628: checking to see if all hosts have failed and the running result is not ok 23826 1726867451.46629: done checking to see if all hosts have failed 23826 1726867451.46630: getting the remaining hosts for this loop 23826 1726867451.46631: done getting the remaining hosts for this loop 23826 1726867451.46634: getting the next task for host managed_node2 23826 1726867451.46639: done getting next task for host managed_node2 23826 1726867451.46642: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 23826 1726867451.46644: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867451.46656: getting variables 23826 1726867451.46657: in VariableManager get_vars() 23826 1726867451.46699: Calling all_inventory to load vars for managed_node2 23826 1726867451.46702: Calling groups_inventory to load vars for managed_node2 23826 1726867451.46704: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867451.46712: Calling all_plugins_play to load vars for managed_node2 23826 1726867451.46715: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867451.46717: Calling groups_plugins_play to load vars for managed_node2 23826 1726867451.47491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867451.48454: done with get_vars() 23826 1726867451.48469: done getting variables 23826 1726867451.48510: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:24:11 -0400 (0:00:00.060) 0:00:33.496 ****** 23826 1726867451.48532: entering _queue_task() for managed_node2/service 23826 1726867451.48734: worker is 1 (out of 1 available) 23826 1726867451.48746: exiting _queue_task() for managed_node2/service 23826 1726867451.48758: done queuing things up, now waiting for results queue to drain 23826 1726867451.48760: waiting for pending results... 23826 1726867451.48932: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 23826 1726867451.49002: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000092 23826 1726867451.49015: variable 'ansible_search_path' from source: unknown 23826 1726867451.49018: variable 'ansible_search_path' from source: unknown 23826 1726867451.49045: calling self._execute() 23826 1726867451.49117: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867451.49122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867451.49130: variable 'omit' from source: magic vars 23826 1726867451.49386: variable 'ansible_distribution_major_version' from source: facts 23826 1726867451.49395: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867451.49479: variable 'network_provider' from source: set_fact 23826 1726867451.49483: Evaluated conditional (network_provider == "initscripts"): False 23826 1726867451.49485: when evaluation is False, skipping this task 23826 1726867451.49488: _execute() done 23826 1726867451.49490: dumping result to json 23826 1726867451.49492: done dumping result, returning 23826 1726867451.49499: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-a92d-a3ea-000000000092] 23826 1726867451.49504: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000092 23826 1726867451.49589: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000092 23826 1726867451.49591: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 23826 
1726867451.49666: no more pending results, returning what we have 23826 1726867451.49669: results queue empty 23826 1726867451.49669: checking for any_errors_fatal 23826 1726867451.49675: done checking for any_errors_fatal 23826 1726867451.49676: checking for max_fail_percentage 23826 1726867451.49682: done checking for max_fail_percentage 23826 1726867451.49683: checking to see if all hosts have failed and the running result is not ok 23826 1726867451.49684: done checking to see if all hosts have failed 23826 1726867451.49685: getting the remaining hosts for this loop 23826 1726867451.49686: done getting the remaining hosts for this loop 23826 1726867451.49689: getting the next task for host managed_node2 23826 1726867451.49693: done getting next task for host managed_node2 23826 1726867451.49696: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 23826 1726867451.49698: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867451.49712: getting variables 23826 1726867451.49713: in VariableManager get_vars() 23826 1726867451.49741: Calling all_inventory to load vars for managed_node2 23826 1726867451.49743: Calling groups_inventory to load vars for managed_node2 23826 1726867451.49744: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867451.49750: Calling all_plugins_play to load vars for managed_node2 23826 1726867451.49752: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867451.49753: Calling groups_plugins_play to load vars for managed_node2 23826 1726867451.50482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867451.51343: done with get_vars() 23826 1726867451.51358: done getting variables 23826 1726867451.51400: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:24:11 -0400 (0:00:00.028) 0:00:33.525 ****** 23826 1726867451.51423: entering _queue_task() for managed_node2/copy 23826 1726867451.51618: worker is 1 (out of 1 available) 23826 1726867451.51631: exiting _queue_task() for managed_node2/copy 23826 1726867451.51643: done queuing things up, now waiting for results queue to drain 23826 1726867451.51644: waiting for pending results... 
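Each of the skips recorded above follows the same pattern: the task's when: expression is rendered with Jinja2 against the task variables, and the task is skipped when the result is falsy, which is what the "Evaluated conditional (...): False" and "when evaluation is False, skipping this task" lines document. A simplified illustration is sketched below, using the expressions and variable values visible in this run; it assumes the jinja2 package (already present on this control node) and is not Ansible's own conditional-evaluation code.

    from jinja2 import Environment

    # Values as they stand in this run: network_provider came from set_fact,
    # and __network_wpa_supplicant_required resolved falsy per the skip above.
    task_vars = {
        "network_provider": "nm",
        "__network_wpa_supplicant_required": False,
    }

    env = Environment()
    for cond in ('network_provider == "initscripts"', "__network_wpa_supplicant_required"):
        value = bool(env.compile_expression(cond)(**task_vars))
        print(f"Evaluated conditional ({cond}): {value}")
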
23826 1726867451.51816: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 23826 1726867451.51888: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000093 23826 1726867451.51898: variable 'ansible_search_path' from source: unknown 23826 1726867451.51901: variable 'ansible_search_path' from source: unknown 23826 1726867451.51932: calling self._execute() 23826 1726867451.52002: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867451.52006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867451.52017: variable 'omit' from source: magic vars 23826 1726867451.52275: variable 'ansible_distribution_major_version' from source: facts 23826 1726867451.52285: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867451.52367: variable 'network_provider' from source: set_fact 23826 1726867451.52370: Evaluated conditional (network_provider == "initscripts"): False 23826 1726867451.52373: when evaluation is False, skipping this task 23826 1726867451.52376: _execute() done 23826 1726867451.52380: dumping result to json 23826 1726867451.52383: done dumping result, returning 23826 1726867451.52392: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-a92d-a3ea-000000000093] 23826 1726867451.52394: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000093 23826 1726867451.52486: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000093 23826 1726867451.52489: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 23826 1726867451.52554: no more pending results, returning what we have 23826 1726867451.52556: results queue empty 23826 1726867451.52557: checking for any_errors_fatal 23826 1726867451.52563: done checking for any_errors_fatal 23826 1726867451.52564: checking for max_fail_percentage 23826 1726867451.52565: done checking for max_fail_percentage 23826 1726867451.52566: checking to see if all hosts have failed and the running result is not ok 23826 1726867451.52567: done checking to see if all hosts have failed 23826 1726867451.52568: getting the remaining hosts for this loop 23826 1726867451.52569: done getting the remaining hosts for this loop 23826 1726867451.52571: getting the next task for host managed_node2 23826 1726867451.52575: done getting next task for host managed_node2 23826 1726867451.52580: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 23826 1726867451.52582: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867451.52594: getting variables 23826 1726867451.52595: in VariableManager get_vars() 23826 1726867451.52623: Calling all_inventory to load vars for managed_node2 23826 1726867451.52626: Calling groups_inventory to load vars for managed_node2 23826 1726867451.52628: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867451.52634: Calling all_plugins_play to load vars for managed_node2 23826 1726867451.52637: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867451.52640: Calling groups_plugins_play to load vars for managed_node2 23826 1726867451.53486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867451.54342: done with get_vars() 23826 1726867451.54356: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:24:11 -0400 (0:00:00.029) 0:00:33.555 ****** 23826 1726867451.54419: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 23826 1726867451.54623: worker is 1 (out of 1 available) 23826 1726867451.54635: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 23826 1726867451.54647: done queuing things up, now waiting for results queue to drain 23826 1726867451.54648: waiting for pending results... 23826 1726867451.54824: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 23826 1726867451.54883: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000094 23826 1726867451.54901: variable 'ansible_search_path' from source: unknown 23826 1726867451.54905: variable 'ansible_search_path' from source: unknown 23826 1726867451.54932: calling self._execute() 23826 1726867451.55010: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867451.55014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867451.55023: variable 'omit' from source: magic vars 23826 1726867451.55286: variable 'ansible_distribution_major_version' from source: facts 23826 1726867451.55295: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867451.55300: variable 'omit' from source: magic vars 23826 1726867451.55333: variable 'omit' from source: magic vars 23826 1726867451.55447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 23826 1726867451.56859: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 23826 1726867451.56903: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 23826 1726867451.56929: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 23826 1726867451.56956: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 23826 1726867451.56979: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 23826 1726867451.57034: variable 'network_provider' from source: set_fact 23826 1726867451.57123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 23826 1726867451.57155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 23826 1726867451.57174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 23826 1726867451.57202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 23826 1726867451.57213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 23826 1726867451.57266: variable 'omit' from source: magic vars 23826 1726867451.57342: variable 'omit' from source: magic vars 23826 1726867451.57484: variable 'network_connections' from source: play vars 23826 1726867451.57489: variable 'profile' from source: play vars 23826 1726867451.57492: variable 'profile' from source: play vars 23826 1726867451.57494: variable 'interface' from source: set_fact 23826 1726867451.57582: variable 'interface' from source: set_fact 23826 1726867451.57615: variable 'omit' from source: magic vars 23826 1726867451.57626: variable '__lsr_ansible_managed' from source: task vars 23826 1726867451.57668: variable '__lsr_ansible_managed' from source: task vars 23826 1726867451.57853: Loaded config def from plugin (lookup/template) 23826 1726867451.57857: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 23826 1726867451.57879: File lookup term: get_ansible_managed.j2 23826 1726867451.57883: variable 'ansible_search_path' from source: unknown 23826 1726867451.57887: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 23826 1726867451.57897: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 23826 1726867451.57910: variable 'ansible_search_path' from source: unknown 23826 1726867451.64925: variable 'ansible_managed' from source: unknown 23826 1726867451.65108: variable 'omit' from source: magic vars 23826 1726867451.65112: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867451.65115: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867451.65118: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867451.65120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867451.65122: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867451.65123: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867451.65126: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867451.65128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867451.65130: Set connection var ansible_timeout to 10 23826 1726867451.65136: Set connection var ansible_shell_executable to /bin/sh 23826 1726867451.65138: Set connection var ansible_connection to ssh 23826 1726867451.65145: Set connection var ansible_pipelining to False 23826 1726867451.65147: Set connection var ansible_shell_type to sh 23826 1726867451.65152: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867451.65169: variable 'ansible_shell_executable' from source: unknown 23826 1726867451.65171: variable 'ansible_connection' from source: unknown 23826 1726867451.65174: variable 'ansible_module_compression' from source: unknown 23826 1726867451.65176: variable 'ansible_shell_type' from source: unknown 23826 1726867451.65181: variable 'ansible_shell_executable' from source: unknown 23826 1726867451.65183: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867451.65185: variable 'ansible_pipelining' from source: unknown 23826 1726867451.65187: variable 'ansible_timeout' from source: unknown 23826 1726867451.65192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867451.65276: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 23826 1726867451.65289: variable 'omit' from source: magic vars 23826 1726867451.65292: starting attempt loop 23826 1726867451.65294: running the handler 23826 1726867451.65299: _low_level_execute_command(): starting 23826 1726867451.65304: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867451.65787: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867451.65791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867451.65794: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 
1726867451.65796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867451.65845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867451.65848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867451.65850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867451.65908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867451.67597: stdout chunk (state=3): >>>/root <<< 23826 1726867451.67694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867451.67725: stderr chunk (state=3): >>><<< 23826 1726867451.67728: stdout chunk (state=3): >>><<< 23826 1726867451.67746: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867451.67755: _low_level_execute_command(): starting 23826 1726867451.67760: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867451.6774569-25475-256199482359521 `" && echo ansible-tmp-1726867451.6774569-25475-256199482359521="` echo /root/.ansible/tmp/ansible-tmp-1726867451.6774569-25475-256199482359521 `" ) && sleep 0' 23826 1726867451.68184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867451.68187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 23826 1726867451.68190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 23826 1726867451.68192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867451.68238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867451.68241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867451.68288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867451.70312: stdout chunk (state=3): >>>ansible-tmp-1726867451.6774569-25475-256199482359521=/root/.ansible/tmp/ansible-tmp-1726867451.6774569-25475-256199482359521 <<< 23826 1726867451.70388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867451.70391: stdout chunk (state=3): >>><<< 23826 1726867451.70393: stderr chunk (state=3): >>><<< 23826 1726867451.70409: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867451.6774569-25475-256199482359521=/root/.ansible/tmp/ansible-tmp-1726867451.6774569-25475-256199482359521 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867451.70493: variable 'ansible_module_compression' from source: unknown 23826 1726867451.70513: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 23826 1726867451.70564: variable 'ansible_facts' from source: unknown 23826 1726867451.70704: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867451.6774569-25475-256199482359521/AnsiballZ_network_connections.py 23826 1726867451.70921: Sending initial data 23826 1726867451.70925: Sent initial data (168 bytes) 23826 1726867451.71415: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867451.71427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867451.71531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867451.71559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867451.71636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867451.73224: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 23826 1726867451.73259: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867451.73301: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867451.73338: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmp9ay0020e /root/.ansible/tmp/ansible-tmp-1726867451.6774569-25475-256199482359521/AnsiballZ_network_connections.py <<< 23826 1726867451.73345: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867451.6774569-25475-256199482359521/AnsiballZ_network_connections.py" <<< 23826 1726867451.73381: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmp9ay0020e" to remote "/root/.ansible/tmp/ansible-tmp-1726867451.6774569-25475-256199482359521/AnsiballZ_network_connections.py" <<< 23826 1726867451.73384: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867451.6774569-25475-256199482359521/AnsiballZ_network_connections.py" <<< 23826 1726867451.74090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867451.74131: stderr chunk (state=3): >>><<< 23826 1726867451.74134: stdout chunk (state=3): >>><<< 23826 1726867451.74165: done transferring module to remote 23826 1726867451.74174: _low_level_execute_command(): starting 23826 1726867451.74180: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867451.6774569-25475-256199482359521/ /root/.ansible/tmp/ansible-tmp-1726867451.6774569-25475-256199482359521/AnsiballZ_network_connections.py && sleep 0' 23826 1726867451.74784: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867451.74787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867451.74790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867451.74792: stderr chunk (state=3): >>>debug2: match found <<< 23826 1726867451.74809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867451.74925: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867451.74932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867451.76743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867451.76752: stdout chunk (state=3): >>><<< 23826 1726867451.76762: stderr chunk (state=3): >>><<< 23826 1726867451.76781: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867451.76792: _low_level_execute_command(): starting 23826 1726867451.76801: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867451.6774569-25475-256199482359521/AnsiballZ_network_connections.py && sleep 0' 23826 1726867451.77336: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867451.77349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867451.77361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867451.77383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867451.77401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867451.77415: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867451.77430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867451.77444: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867451.77457: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 23826 1726867451.77468: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 23826 1726867451.77489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867451.77505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867451.77521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867451.77533: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867451.77593: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867451.77625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867451.77642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867451.77666: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867451.77741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867452.05549: stdout chunk (state=3): >>>Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_06h6vdh_/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_06h6vdh_/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/41871d48-5e02-497c-b448-55b0b16ff70d: error=unknown <<< 23826 1726867452.05687: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 23826 1726867452.07533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 23826 1726867452.07567: stderr chunk (state=3): >>><<< 23826 1726867452.07580: stdout chunk (state=3): >>><<< 23826 1726867452.07606: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_06h6vdh_/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_06h6vdh_/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/41871d48-5e02-497c-b448-55b0b16ff70d: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 23826 1726867452.07644: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867451.6774569-25475-256199482359521/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867452.07658: _low_level_execute_command(): starting 23826 1726867452.07737: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867451.6774569-25475-256199482359521/ > /dev/null 2>&1 && sleep 0' 23826 1726867452.08268: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867452.08290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867452.08315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867452.08353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867452.08357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 23826 1726867452.08367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867452.08413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867452.08426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867452.08506: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 23826 1726867452.10313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867452.10336: stderr chunk (state=3): >>><<< 23826 1726867452.10339: stdout chunk (state=3): >>><<< 23826 1726867452.10353: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867452.10361: handler run complete 23826 1726867452.10381: attempt loop complete, returning result 23826 1726867452.10384: _execute() done 23826 1726867452.10386: dumping result to json 23826 1726867452.10389: done dumping result, returning 23826 1726867452.10399: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-a92d-a3ea-000000000094] 23826 1726867452.10401: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000094 23826 1726867452.10493: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000094 23826 1726867452.10496: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 23826 1726867452.10578: no more pending results, returning what we have 23826 1726867452.10582: results queue empty 23826 1726867452.10583: checking for any_errors_fatal 23826 1726867452.10588: done checking for any_errors_fatal 23826 1726867452.10589: checking for max_fail_percentage 23826 1726867452.10590: done checking for max_fail_percentage 23826 1726867452.10591: checking to see if all hosts have failed and the running result is not ok 23826 1726867452.10592: done checking to see if all hosts have failed 23826 1726867452.10592: getting the remaining hosts for this loop 23826 1726867452.10594: done getting the remaining hosts for this loop 23826 1726867452.10597: getting the next task for host managed_node2 23826 1726867452.10602: done getting next task for host managed_node2 23826 1726867452.10605: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 23826 1726867452.10607: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867452.10618: getting variables 23826 1726867452.10619: in VariableManager get_vars() 23826 1726867452.10652: Calling all_inventory to load vars for managed_node2 23826 1726867452.10655: Calling groups_inventory to load vars for managed_node2 23826 1726867452.10657: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867452.10665: Calling all_plugins_play to load vars for managed_node2 23826 1726867452.10668: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867452.10670: Calling groups_plugins_play to load vars for managed_node2 23826 1726867452.11963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867452.12840: done with get_vars() 23826 1726867452.12858: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:24:12 -0400 (0:00:00.585) 0:00:34.140 ****** 23826 1726867452.12922: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 23826 1726867452.13159: worker is 1 (out of 1 available) 23826 1726867452.13171: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 23826 1726867452.13184: done queuing things up, now waiting for results queue to drain 23826 1726867452.13186: waiting for pending results... 23826 1726867452.13366: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 23826 1726867452.13446: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000095 23826 1726867452.13458: variable 'ansible_search_path' from source: unknown 23826 1726867452.13461: variable 'ansible_search_path' from source: unknown 23826 1726867452.13491: calling self._execute() 23826 1726867452.13564: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867452.13569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867452.13578: variable 'omit' from source: magic vars 23826 1726867452.13864: variable 'ansible_distribution_major_version' from source: facts 23826 1726867452.13874: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867452.13962: variable 'network_state' from source: role '' defaults 23826 1726867452.13965: Evaluated conditional (network_state != {}): False 23826 1726867452.13968: when evaluation is False, skipping this task 23826 1726867452.13971: _execute() done 23826 1726867452.13974: dumping result to json 23826 1726867452.13980: done dumping result, returning 23826 1726867452.13987: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-a92d-a3ea-000000000095] 23826 1726867452.13991: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000095 23826 1726867452.14073: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000095 23826 1726867452.14076: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 23826 1726867452.14128: no more pending results, returning what we have 23826 1726867452.14131: results 
queue empty 23826 1726867452.14132: checking for any_errors_fatal 23826 1726867452.14146: done checking for any_errors_fatal 23826 1726867452.14146: checking for max_fail_percentage 23826 1726867452.14148: done checking for max_fail_percentage 23826 1726867452.14149: checking to see if all hosts have failed and the running result is not ok 23826 1726867452.14150: done checking to see if all hosts have failed 23826 1726867452.14150: getting the remaining hosts for this loop 23826 1726867452.14152: done getting the remaining hosts for this loop 23826 1726867452.14155: getting the next task for host managed_node2 23826 1726867452.14161: done getting next task for host managed_node2 23826 1726867452.14164: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 23826 1726867452.14166: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867452.14181: getting variables 23826 1726867452.14183: in VariableManager get_vars() 23826 1726867452.14214: Calling all_inventory to load vars for managed_node2 23826 1726867452.14217: Calling groups_inventory to load vars for managed_node2 23826 1726867452.14219: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867452.14228: Calling all_plugins_play to load vars for managed_node2 23826 1726867452.14230: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867452.14232: Calling groups_plugins_play to load vars for managed_node2 23826 1726867452.15105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867452.15954: done with get_vars() 23826 1726867452.15969: done getting variables 23826 1726867452.16010: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:24:12 -0400 (0:00:00.031) 0:00:34.171 ****** 23826 1726867452.16033: entering _queue_task() for managed_node2/debug 23826 1726867452.16230: worker is 1 (out of 1 available) 23826 1726867452.16242: exiting _queue_task() for managed_node2/debug 23826 1726867452.16254: done queuing things up, now waiting for results queue to drain 23826 1726867452.16255: waiting for pending results... 
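[editor's note] For readers tracing the `Evaluated conditional (...)` and `skipping:` entries in the run above: each of these role tasks is gated by a templated when: expression, and a False result short-circuits the task into a skip result instead of an error. Below is a minimal sketch of that flow using plain Jinja2 expression compilation rather than ansible-core's own Templar; the helper names and the simplified result shape are illustrative assumptions, not role or core code.

    # Minimal sketch of "when:" gating as seen in the log above: evaluate the
    # conditional against the available variables, run the task if it holds,
    # otherwise return a skip result shaped like the "skipping:" entries here.
    # Plain Jinja2 stands in for ansible-core's Templar (assumption).
    from jinja2 import Environment

    _env = Environment()

    def evaluate_when(expression: str, variables: dict) -> bool:
        """Compile a bare conditional expression and coerce its result to bool."""
        return bool(_env.compile_expression(expression)(**variables))

    def gate_task(condition: str, variables: dict) -> dict | None:
        if evaluate_when(condition, variables):
            return None  # conditional holds; the task would be executed
        return {
            "changed": False,
            "false_condition": condition,
            "skip_reason": "Conditional result was False",
        }

    variables = {"ansible_distribution_major_version": "10", "network_state": {}}
    print(gate_task("ansible_distribution_major_version != '6'", variables))  # None -> runs
    print(gate_task("network_state != {}", variables))                        # skip result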
23826 1726867452.16428: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 23826 1726867452.16498: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000096 23826 1726867452.16509: variable 'ansible_search_path' from source: unknown 23826 1726867452.16514: variable 'ansible_search_path' from source: unknown 23826 1726867452.16542: calling self._execute() 23826 1726867452.16617: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867452.16621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867452.16630: variable 'omit' from source: magic vars 23826 1726867452.16887: variable 'ansible_distribution_major_version' from source: facts 23826 1726867452.16896: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867452.16901: variable 'omit' from source: magic vars 23826 1726867452.16935: variable 'omit' from source: magic vars 23826 1726867452.16960: variable 'omit' from source: magic vars 23826 1726867452.16993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867452.17025: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867452.17040: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867452.17053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867452.17063: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867452.17085: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867452.17089: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867452.17092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867452.17162: Set connection var ansible_timeout to 10 23826 1726867452.17168: Set connection var ansible_shell_executable to /bin/sh 23826 1726867452.17171: Set connection var ansible_connection to ssh 23826 1726867452.17179: Set connection var ansible_pipelining to False 23826 1726867452.17182: Set connection var ansible_shell_type to sh 23826 1726867452.17187: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867452.17204: variable 'ansible_shell_executable' from source: unknown 23826 1726867452.17207: variable 'ansible_connection' from source: unknown 23826 1726867452.17210: variable 'ansible_module_compression' from source: unknown 23826 1726867452.17215: variable 'ansible_shell_type' from source: unknown 23826 1726867452.17217: variable 'ansible_shell_executable' from source: unknown 23826 1726867452.17219: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867452.17224: variable 'ansible_pipelining' from source: unknown 23826 1726867452.17226: variable 'ansible_timeout' from source: unknown 23826 1726867452.17230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867452.17330: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 
1726867452.17340: variable 'omit' from source: magic vars 23826 1726867452.17346: starting attempt loop 23826 1726867452.17348: running the handler 23826 1726867452.17439: variable '__network_connections_result' from source: set_fact 23826 1726867452.17483: handler run complete 23826 1726867452.17496: attempt loop complete, returning result 23826 1726867452.17499: _execute() done 23826 1726867452.17502: dumping result to json 23826 1726867452.17504: done dumping result, returning 23826 1726867452.17516: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-a92d-a3ea-000000000096] 23826 1726867452.17518: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000096 23826 1726867452.17594: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000096 23826 1726867452.17597: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 23826 1726867452.17650: no more pending results, returning what we have 23826 1726867452.17653: results queue empty 23826 1726867452.17654: checking for any_errors_fatal 23826 1726867452.17660: done checking for any_errors_fatal 23826 1726867452.17661: checking for max_fail_percentage 23826 1726867452.17662: done checking for max_fail_percentage 23826 1726867452.17663: checking to see if all hosts have failed and the running result is not ok 23826 1726867452.17664: done checking to see if all hosts have failed 23826 1726867452.17664: getting the remaining hosts for this loop 23826 1726867452.17666: done getting the remaining hosts for this loop 23826 1726867452.17669: getting the next task for host managed_node2 23826 1726867452.17674: done getting next task for host managed_node2 23826 1726867452.17679: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 23826 1726867452.17681: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867452.17689: getting variables 23826 1726867452.17690: in VariableManager get_vars() 23826 1726867452.17717: Calling all_inventory to load vars for managed_node2 23826 1726867452.17720: Calling groups_inventory to load vars for managed_node2 23826 1726867452.17722: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867452.17729: Calling all_plugins_play to load vars for managed_node2 23826 1726867452.17731: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867452.17733: Calling groups_plugins_play to load vars for managed_node2 23826 1726867452.18483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867452.19426: done with get_vars() 23826 1726867452.19440: done getting variables 23826 1726867452.19478: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:24:12 -0400 (0:00:00.034) 0:00:34.206 ****** 23826 1726867452.19499: entering _queue_task() for managed_node2/debug 23826 1726867452.19691: worker is 1 (out of 1 available) 23826 1726867452.19703: exiting _queue_task() for managed_node2/debug 23826 1726867452.19713: done queuing things up, now waiting for results queue to drain 23826 1726867452.19715: waiting for pending results... 23826 1726867452.19884: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 23826 1726867452.19948: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000097 23826 1726867452.19959: variable 'ansible_search_path' from source: unknown 23826 1726867452.19962: variable 'ansible_search_path' from source: unknown 23826 1726867452.19990: calling self._execute() 23826 1726867452.20058: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867452.20064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867452.20073: variable 'omit' from source: magic vars 23826 1726867452.20327: variable 'ansible_distribution_major_version' from source: facts 23826 1726867452.20336: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867452.20341: variable 'omit' from source: magic vars 23826 1726867452.20364: variable 'omit' from source: magic vars 23826 1726867452.20393: variable 'omit' from source: magic vars 23826 1726867452.20425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867452.20449: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867452.20466: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867452.20480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867452.20492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867452.20515: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867452.20519: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867452.20521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867452.20586: Set connection var ansible_timeout to 10 23826 1726867452.20599: Set connection var ansible_shell_executable to /bin/sh 23826 1726867452.20602: Set connection var ansible_connection to ssh 23826 1726867452.20604: Set connection var ansible_pipelining to False 23826 1726867452.20606: Set connection var ansible_shell_type to sh 23826 1726867452.20613: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867452.20630: variable 'ansible_shell_executable' from source: unknown 23826 1726867452.20633: variable 'ansible_connection' from source: unknown 23826 1726867452.20636: variable 'ansible_module_compression' from source: unknown 23826 1726867452.20638: variable 'ansible_shell_type' from source: unknown 23826 1726867452.20640: variable 'ansible_shell_executable' from source: unknown 23826 1726867452.20642: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867452.20645: variable 'ansible_pipelining' from source: unknown 23826 1726867452.20648: variable 'ansible_timeout' from source: unknown 23826 1726867452.20652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867452.20753: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867452.20764: variable 'omit' from source: magic vars 23826 1726867452.20770: starting attempt loop 23826 1726867452.20772: running the handler 23826 1726867452.20817: variable '__network_connections_result' from source: set_fact 23826 1726867452.20864: variable '__network_connections_result' from source: set_fact 23826 1726867452.20939: handler run complete 23826 1726867452.20954: attempt loop complete, returning result 23826 1726867452.20957: _execute() done 23826 1726867452.20960: dumping result to json 23826 1726867452.20964: done dumping result, returning 23826 1726867452.20971: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-a92d-a3ea-000000000097] 23826 1726867452.20974: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000097 23826 1726867452.21055: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000097 23826 1726867452.21058: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 23826 1726867452.21130: no more pending results, returning what we have 23826 1726867452.21132: results queue empty 23826 1726867452.21133: checking for any_errors_fatal 23826 1726867452.21137: done checking for any_errors_fatal 23826 1726867452.21138: checking for max_fail_percentage 23826 1726867452.21139: done checking for max_fail_percentage 23826 1726867452.21140: checking to see 
if all hosts have failed and the running result is not ok 23826 1726867452.21141: done checking to see if all hosts have failed 23826 1726867452.21142: getting the remaining hosts for this loop 23826 1726867452.21143: done getting the remaining hosts for this loop 23826 1726867452.21145: getting the next task for host managed_node2 23826 1726867452.21150: done getting next task for host managed_node2 23826 1726867452.21153: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 23826 1726867452.21155: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867452.21163: getting variables 23826 1726867452.21164: in VariableManager get_vars() 23826 1726867452.21199: Calling all_inventory to load vars for managed_node2 23826 1726867452.21202: Calling groups_inventory to load vars for managed_node2 23826 1726867452.21204: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867452.21211: Calling all_plugins_play to load vars for managed_node2 23826 1726867452.21214: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867452.21216: Calling groups_plugins_play to load vars for managed_node2 23826 1726867452.24970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867452.25820: done with get_vars() 23826 1726867452.25835: done getting variables 23826 1726867452.25869: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:24:12 -0400 (0:00:00.063) 0:00:34.270 ****** 23826 1726867452.25892: entering _queue_task() for managed_node2/debug 23826 1726867452.26126: worker is 1 (out of 1 available) 23826 1726867452.26138: exiting _queue_task() for managed_node2/debug 23826 1726867452.26150: done queuing things up, now waiting for results queue to drain 23826 1726867452.26151: waiting for pending results... 
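[editor's note] The `__network_connections_result` value printed by the debug task above is the dict that was recovered earlier in this run from module stdout that carried a Python traceback ahead of the JSON result, and its stderr_lines of [""] is simply the line-split form of the stderr value "\n". Below is a simplified sketch of both steps; ansible-core's own handling of non-JSON module output is more involved, and extract_result is an illustrative helper, not its API.

    # Simplified sketch of turning raw module stdout/stderr into the result
    # dict shown by the debug task above: keep the first line that parses as
    # a JSON object and derive stderr_lines by line-splitting (illustrative
    # parser, not ansible-core's filtering logic).
    import json

    def extract_result(stdout: str, stderr: str) -> dict:
        result = None
        for line in stdout.splitlines():
            line = line.strip()
            if line.startswith("{"):
                try:
                    result = json.loads(line)
                    break
                except json.JSONDecodeError:
                    continue
        if result is None:
            raise ValueError("no JSON result found in module stdout")
        result["stderr"] = stderr
        result["stderr_lines"] = stderr.splitlines()   # "\n" -> [""]
        return result

    mixed_stdout = (
        "Traceback (most recent call last):\n"
        "  (module-side traceback elided)\n"
        '{"changed": true, "warnings": [], "failed": false}\n'
    )
    print(extract_result(mixed_stdout, "\n"))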
23826 1726867452.26335: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 23826 1726867452.26411: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000098 23826 1726867452.26425: variable 'ansible_search_path' from source: unknown 23826 1726867452.26428: variable 'ansible_search_path' from source: unknown 23826 1726867452.26457: calling self._execute() 23826 1726867452.26536: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867452.26543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867452.26551: variable 'omit' from source: magic vars 23826 1726867452.26828: variable 'ansible_distribution_major_version' from source: facts 23826 1726867452.26838: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867452.26925: variable 'network_state' from source: role '' defaults 23826 1726867452.26936: Evaluated conditional (network_state != {}): False 23826 1726867452.26939: when evaluation is False, skipping this task 23826 1726867452.26942: _execute() done 23826 1726867452.26945: dumping result to json 23826 1726867452.26953: done dumping result, returning 23826 1726867452.26957: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-a92d-a3ea-000000000098] 23826 1726867452.26963: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000098 23826 1726867452.27048: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000098 23826 1726867452.27051: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 23826 1726867452.27096: no more pending results, returning what we have 23826 1726867452.27100: results queue empty 23826 1726867452.27101: checking for any_errors_fatal 23826 1726867452.27114: done checking for any_errors_fatal 23826 1726867452.27115: checking for max_fail_percentage 23826 1726867452.27117: done checking for max_fail_percentage 23826 1726867452.27118: checking to see if all hosts have failed and the running result is not ok 23826 1726867452.27119: done checking to see if all hosts have failed 23826 1726867452.27119: getting the remaining hosts for this loop 23826 1726867452.27121: done getting the remaining hosts for this loop 23826 1726867452.27124: getting the next task for host managed_node2 23826 1726867452.27130: done getting next task for host managed_node2 23826 1726867452.27133: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 23826 1726867452.27136: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867452.27148: getting variables 23826 1726867452.27150: in VariableManager get_vars() 23826 1726867452.27180: Calling all_inventory to load vars for managed_node2 23826 1726867452.27183: Calling groups_inventory to load vars for managed_node2 23826 1726867452.27185: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867452.27193: Calling all_plugins_play to load vars for managed_node2 23826 1726867452.27195: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867452.27197: Calling groups_plugins_play to load vars for managed_node2 23826 1726867452.28006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867452.28903: done with get_vars() 23826 1726867452.28919: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:24:12 -0400 (0:00:00.030) 0:00:34.301 ****** 23826 1726867452.28981: entering _queue_task() for managed_node2/ping 23826 1726867452.29175: worker is 1 (out of 1 available) 23826 1726867452.29190: exiting _queue_task() for managed_node2/ping 23826 1726867452.29200: done queuing things up, now waiting for results queue to drain 23826 1726867452.29201: waiting for pending results... 23826 1726867452.29359: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 23826 1726867452.29428: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000099 23826 1726867452.29442: variable 'ansible_search_path' from source: unknown 23826 1726867452.29446: variable 'ansible_search_path' from source: unknown 23826 1726867452.29471: calling self._execute() 23826 1726867452.29537: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867452.29544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867452.29557: variable 'omit' from source: magic vars 23826 1726867452.29825: variable 'ansible_distribution_major_version' from source: facts 23826 1726867452.29834: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867452.29839: variable 'omit' from source: magic vars 23826 1726867452.29874: variable 'omit' from source: magic vars 23826 1726867452.29896: variable 'omit' from source: magic vars 23826 1726867452.29929: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867452.29956: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867452.29974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867452.29990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867452.29999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867452.30024: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867452.30028: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867452.30031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867452.30102: Set connection var ansible_timeout to 10 23826 1726867452.30109: Set connection var 
ansible_shell_executable to /bin/sh 23826 1726867452.30115: Set connection var ansible_connection to ssh 23826 1726867452.30122: Set connection var ansible_pipelining to False 23826 1726867452.30125: Set connection var ansible_shell_type to sh 23826 1726867452.30129: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867452.30147: variable 'ansible_shell_executable' from source: unknown 23826 1726867452.30150: variable 'ansible_connection' from source: unknown 23826 1726867452.30153: variable 'ansible_module_compression' from source: unknown 23826 1726867452.30155: variable 'ansible_shell_type' from source: unknown 23826 1726867452.30158: variable 'ansible_shell_executable' from source: unknown 23826 1726867452.30160: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867452.30162: variable 'ansible_pipelining' from source: unknown 23826 1726867452.30164: variable 'ansible_timeout' from source: unknown 23826 1726867452.30169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867452.30314: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 23826 1726867452.30323: variable 'omit' from source: magic vars 23826 1726867452.30328: starting attempt loop 23826 1726867452.30331: running the handler 23826 1726867452.30342: _low_level_execute_command(): starting 23826 1726867452.30349: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867452.30853: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867452.30857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 23826 1726867452.30861: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867452.30919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867452.30926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867452.30928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867452.30976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867452.32669: stdout chunk (state=3): >>>/root <<< 23826 1726867452.32763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867452.32789: stderr chunk (state=3): >>><<< 23826 1726867452.32794: stdout chunk (state=3): >>><<< 23826 1726867452.32816: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867452.32827: _low_level_execute_command(): starting 23826 1726867452.32833: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867452.3281546-25505-105937091108289 `" && echo ansible-tmp-1726867452.3281546-25505-105937091108289="` echo /root/.ansible/tmp/ansible-tmp-1726867452.3281546-25505-105937091108289 `" ) && sleep 0' 23826 1726867452.33265: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867452.33268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867452.33271: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867452.33281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867452.33324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867452.33328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867452.33332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867452.33375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867452.35285: stdout chunk (state=3): >>>ansible-tmp-1726867452.3281546-25505-105937091108289=/root/.ansible/tmp/ansible-tmp-1726867452.3281546-25505-105937091108289 <<< 23826 1726867452.35393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867452.35420: stderr chunk (state=3): >>><<< 23826 1726867452.35423: stdout chunk (state=3): >>><<< 
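
[editor's note] The two role tasks traced above follow a common Ansible pattern: "Show debug messages for the network_state" is skipped because the role default for network_state is {} (the log records false_condition: "network_state != {}"), and "Re-test connectivity" queues the ping module, which is what drives the _low_level_execute_command() calls that follow. The sketch below is an illustrative approximation of how such tasks are typically written; it is not the actual source of roles/network/tasks/main.yml.

# Illustrative sketch only -- not the real fedora.linux_system_roles.network task file.
- name: Show debug messages for the network_state
  ansible.builtin.debug:
    var: network_state
  # Skipped in the log above: network_state comes from the role defaults as {},
  # so this condition evaluates to False and the task result is "skipping".
  when: network_state != {}

- name: Re-test connectivity
  # The ping module is packaged, transferred and executed on the managed host,
  # producing the AnsiballZ_ping.py transfer seen in the following log records.
  ansible.builtin.ping:
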
23826 1726867452.35436: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867452.3281546-25505-105937091108289=/root/.ansible/tmp/ansible-tmp-1726867452.3281546-25505-105937091108289 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867452.35469: variable 'ansible_module_compression' from source: unknown 23826 1726867452.35501: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 23826 1726867452.35537: variable 'ansible_facts' from source: unknown 23826 1726867452.35591: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867452.3281546-25505-105937091108289/AnsiballZ_ping.py 23826 1726867452.35687: Sending initial data 23826 1726867452.35690: Sent initial data (153 bytes) 23826 1726867452.36104: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867452.36110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867452.36112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867452.36114: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867452.36116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867452.36159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867452.36163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867452.36208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867452.37766: stderr chunk (state=3): >>>debug2: 
Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 23826 1726867452.37770: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867452.37804: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 23826 1726867452.37843: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmphhoi_zmi /root/.ansible/tmp/ansible-tmp-1726867452.3281546-25505-105937091108289/AnsiballZ_ping.py <<< 23826 1726867452.37846: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867452.3281546-25505-105937091108289/AnsiballZ_ping.py" <<< 23826 1726867452.37880: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmphhoi_zmi" to remote "/root/.ansible/tmp/ansible-tmp-1726867452.3281546-25505-105937091108289/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867452.3281546-25505-105937091108289/AnsiballZ_ping.py" <<< 23826 1726867452.38372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867452.38405: stderr chunk (state=3): >>><<< 23826 1726867452.38411: stdout chunk (state=3): >>><<< 23826 1726867452.38450: done transferring module to remote 23826 1726867452.38458: _low_level_execute_command(): starting 23826 1726867452.38461: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867452.3281546-25505-105937091108289/ /root/.ansible/tmp/ansible-tmp-1726867452.3281546-25505-105937091108289/AnsiballZ_ping.py && sleep 0' 23826 1726867452.38878: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867452.38882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867452.38884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 23826 1726867452.38886: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867452.38888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867452.38941: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867452.38944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867452.38987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867452.40764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867452.40786: stderr chunk (state=3): >>><<< 23826 1726867452.40789: stdout chunk (state=3): >>><<< 23826 1726867452.40801: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867452.40804: _low_level_execute_command(): starting 23826 1726867452.40808: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867452.3281546-25505-105937091108289/AnsiballZ_ping.py && sleep 0' 23826 1726867452.41213: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867452.41216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867452.41219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867452.41221: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867452.41223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867452.41276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867452.41283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867452.41319: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867452.56522: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 23826 1726867452.57922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 23826 1726867452.57927: stdout chunk (state=3): >>><<< 23826 1726867452.57929: stderr chunk (state=3): >>><<< 23826 1726867452.57949: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
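
[editor's note] The records above show the standard non-pipelined module lifecycle: create a remote temp directory, sftp the cached AnsiballZ_ping.py payload into it, chmod it, run it with /usr/bin/python3.12 (returning {"ping": "pong"}), and then remove the temp directory in the next step. Each stage needs its own round trip over the multiplexed SSH connection because the connection variable ansible_pipelining was set to False earlier in this task. A hedged sketch of how pipelining could be enabled through inventory variables is shown below; ansible_pipelining is a standard connection variable, while the file path and scope are illustrative assumptions.

# group_vars/all.yml (illustrative location) -- with pipelining enabled, module
# payloads are piped to the remote Python interpreter instead of being written
# to a temporary directory over SFTP and chmod'ed first.
ansible_pipelining: true
# Note: pipelining combined with become requires that "requiretty" is not
# enforced in the remote sudoers configuration.
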
23826 1726867452.58059: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867452.3281546-25505-105937091108289/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867452.58063: _low_level_execute_command(): starting 23826 1726867452.58065: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867452.3281546-25505-105937091108289/ > /dev/null 2>&1 && sleep 0' 23826 1726867452.58621: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867452.58638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867452.58652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867452.58668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867452.58687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867452.58698: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867452.58714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867452.58742: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867452.58797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867452.58844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867452.58862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867452.58892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867452.58971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867452.60991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867452.60995: stdout chunk (state=3): >>><<< 23826 1726867452.60997: stderr chunk (state=3): >>><<< 23826 1726867452.61000: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867452.61002: handler run complete 23826 1726867452.61004: attempt loop complete, returning result 23826 1726867452.61006: _execute() done 23826 1726867452.61011: dumping result to json 23826 1726867452.61013: done dumping result, returning 23826 1726867452.61016: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-a92d-a3ea-000000000099] 23826 1726867452.61018: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000099 23826 1726867452.61094: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000099 ok: [managed_node2] => { "changed": false, "ping": "pong" } 23826 1726867452.61161: no more pending results, returning what we have 23826 1726867452.61165: results queue empty 23826 1726867452.61166: checking for any_errors_fatal 23826 1726867452.61173: done checking for any_errors_fatal 23826 1726867452.61174: checking for max_fail_percentage 23826 1726867452.61176: done checking for max_fail_percentage 23826 1726867452.61183: checking to see if all hosts have failed and the running result is not ok 23826 1726867452.61185: done checking to see if all hosts have failed 23826 1726867452.61186: getting the remaining hosts for this loop 23826 1726867452.61187: done getting the remaining hosts for this loop 23826 1726867452.61192: getting the next task for host managed_node2 23826 1726867452.61205: done getting next task for host managed_node2 23826 1726867452.61210: ^ task is: TASK: meta (role_complete) 23826 1726867452.61212: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867452.61389: getting variables 23826 1726867452.61391: in VariableManager get_vars() 23826 1726867452.61435: Calling all_inventory to load vars for managed_node2 23826 1726867452.61439: Calling groups_inventory to load vars for managed_node2 23826 1726867452.61441: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867452.61451: Calling all_plugins_play to load vars for managed_node2 23826 1726867452.61454: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867452.61457: Calling groups_plugins_play to load vars for managed_node2 23826 1726867452.61995: WORKER PROCESS EXITING 23826 1726867452.63135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867452.64883: done with get_vars() 23826 1726867452.64904: done getting variables 23826 1726867452.65029: done queuing things up, now waiting for results queue to drain 23826 1726867452.65030: results queue empty 23826 1726867452.65031: checking for any_errors_fatal 23826 1726867452.65033: done checking for any_errors_fatal 23826 1726867452.65033: checking for max_fail_percentage 23826 1726867452.65034: done checking for max_fail_percentage 23826 1726867452.65034: checking to see if all hosts have failed and the running result is not ok 23826 1726867452.65035: done checking to see if all hosts have failed 23826 1726867452.65035: getting the remaining hosts for this loop 23826 1726867452.65036: done getting the remaining hosts for this loop 23826 1726867452.65037: getting the next task for host managed_node2 23826 1726867452.65040: done getting next task for host managed_node2 23826 1726867452.65045: ^ task is: TASK: meta (flush_handlers) 23826 1726867452.65047: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867452.65053: getting variables 23826 1726867452.65054: in VariableManager get_vars() 23826 1726867452.65066: Calling all_inventory to load vars for managed_node2 23826 1726867452.65070: Calling groups_inventory to load vars for managed_node2 23826 1726867452.65072: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867452.65079: Calling all_plugins_play to load vars for managed_node2 23826 1726867452.65080: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867452.65082: Calling groups_plugins_play to load vars for managed_node2 23826 1726867452.66264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867452.67460: done with get_vars() 23826 1726867452.67481: done getting variables 23826 1726867452.67526: in VariableManager get_vars() 23826 1726867452.67538: Calling all_inventory to load vars for managed_node2 23826 1726867452.67540: Calling groups_inventory to load vars for managed_node2 23826 1726867452.67542: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867452.67546: Calling all_plugins_play to load vars for managed_node2 23826 1726867452.67549: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867452.67551: Calling groups_plugins_play to load vars for managed_node2 23826 1726867452.68704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867452.70752: done with get_vars() 23826 1726867452.70776: done queuing things up, now waiting for results queue to drain 23826 1726867452.70781: results queue empty 23826 1726867452.70782: checking for any_errors_fatal 23826 1726867452.70783: done checking for any_errors_fatal 23826 1726867452.70784: checking for max_fail_percentage 23826 1726867452.70785: done checking for max_fail_percentage 23826 1726867452.70785: checking to see if all hosts have failed and the running result is not ok 23826 1726867452.70786: done checking to see if all hosts have failed 23826 1726867452.70787: getting the remaining hosts for this loop 23826 1726867452.70788: done getting the remaining hosts for this loop 23826 1726867452.70790: getting the next task for host managed_node2 23826 1726867452.70794: done getting next task for host managed_node2 23826 1726867452.70795: ^ task is: TASK: meta (flush_handlers) 23826 1726867452.70797: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867452.70804: getting variables 23826 1726867452.70805: in VariableManager get_vars() 23826 1726867452.70815: Calling all_inventory to load vars for managed_node2 23826 1726867452.70818: Calling groups_inventory to load vars for managed_node2 23826 1726867452.70819: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867452.70824: Calling all_plugins_play to load vars for managed_node2 23826 1726867452.70827: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867452.70830: Calling groups_plugins_play to load vars for managed_node2 23826 1726867452.71995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867452.74900: done with get_vars() 23826 1726867452.74922: done getting variables 23826 1726867452.74975: in VariableManager get_vars() 23826 1726867452.75072: Calling all_inventory to load vars for managed_node2 23826 1726867452.75074: Calling groups_inventory to load vars for managed_node2 23826 1726867452.75076: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867452.75083: Calling all_plugins_play to load vars for managed_node2 23826 1726867452.75085: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867452.75088: Calling groups_plugins_play to load vars for managed_node2 23826 1726867452.76549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867452.79284: done with get_vars() 23826 1726867452.79309: done queuing things up, now waiting for results queue to drain 23826 1726867452.79311: results queue empty 23826 1726867452.79312: checking for any_errors_fatal 23826 1726867452.79314: done checking for any_errors_fatal 23826 1726867452.79314: checking for max_fail_percentage 23826 1726867452.79315: done checking for max_fail_percentage 23826 1726867452.79316: checking to see if all hosts have failed and the running result is not ok 23826 1726867452.79317: done checking to see if all hosts have failed 23826 1726867452.79317: getting the remaining hosts for this loop 23826 1726867452.79318: done getting the remaining hosts for this loop 23826 1726867452.79321: getting the next task for host managed_node2 23826 1726867452.79325: done getting next task for host managed_node2 23826 1726867452.79331: ^ task is: None 23826 1726867452.79332: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867452.79334: done queuing things up, now waiting for results queue to drain 23826 1726867452.79335: results queue empty 23826 1726867452.79335: checking for any_errors_fatal 23826 1726867452.79336: done checking for any_errors_fatal 23826 1726867452.79337: checking for max_fail_percentage 23826 1726867452.79338: done checking for max_fail_percentage 23826 1726867452.79339: checking to see if all hosts have failed and the running result is not ok 23826 1726867452.79339: done checking to see if all hosts have failed 23826 1726867452.79341: getting the next task for host managed_node2 23826 1726867452.79344: done getting next task for host managed_node2 23826 1726867452.79344: ^ task is: None 23826 1726867452.79346: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867452.79396: in VariableManager get_vars() 23826 1726867452.79412: done with get_vars() 23826 1726867452.79419: in VariableManager get_vars() 23826 1726867452.79429: done with get_vars() 23826 1726867452.79439: variable 'omit' from source: magic vars 23826 1726867452.79473: in VariableManager get_vars() 23826 1726867452.79486: done with get_vars() 23826 1726867452.79509: variable 'omit' from source: magic vars PLAY [Delete the interface, then assert that device and profile are absent] **** 23826 1726867452.79770: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 23826 1726867452.79795: getting the remaining hosts for this loop 23826 1726867452.79796: done getting the remaining hosts for this loop 23826 1726867452.79798: getting the next task for host managed_node2 23826 1726867452.79801: done getting next task for host managed_node2 23826 1726867452.79803: ^ task is: TASK: Gathering Facts 23826 1726867452.79804: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867452.79806: getting variables 23826 1726867452.79807: in VariableManager get_vars() 23826 1726867452.79815: Calling all_inventory to load vars for managed_node2 23826 1726867452.79817: Calling groups_inventory to load vars for managed_node2 23826 1726867452.79819: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867452.79824: Calling all_plugins_play to load vars for managed_node2 23826 1726867452.79827: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867452.79830: Calling groups_plugins_play to load vars for managed_node2 23826 1726867452.80533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867452.81654: done with get_vars() 23826 1726867452.81672: done getting variables 23826 1726867452.81714: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:80 Friday 20 September 2024 17:24:12 -0400 (0:00:00.527) 0:00:34.828 ****** 23826 1726867452.81737: entering _queue_task() for managed_node2/gather_facts 23826 1726867452.82045: worker is 1 (out of 1 available) 23826 1726867452.82058: exiting _queue_task() for managed_node2/gather_facts 23826 1726867452.82069: done queuing things up, now waiting for results queue to drain 23826 1726867452.82070: waiting for pending results... 
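
[editor's note] At this point the previous play has completed (role finished, handlers flushed), and the linear strategy moves on to the next play, "Delete the interface, then assert that device and profile are absent", whose first task is the implicit "Gathering Facts" run at tests_ipv6_disabled.yml:80. The sketch below shows what such a play header typically looks like, together with a fact-gated task of the kind this log evaluates repeatedly (ansible_distribution_major_version != '6'); the play body is illustrative and not the contents of the actual test playbook.

# Illustrative play header only -- not the real tests_ipv6_disabled.yml.
- name: Delete the interface, then assert that device and profile are absent
  hosts: managed_node2          # illustrative target; the real play uses its own host pattern
  gather_facts: true            # produces the TASK [Gathering Facts] setup run traced below
  tasks:
    - name: Example of a fact-gated task
      ansible.builtin.debug:
        msg: "Distribution major version is {{ ansible_distribution_major_version }}"
      when: ansible_distribution_major_version != '6'
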
23826 1726867452.82360: running TaskExecutor() for managed_node2/TASK: Gathering Facts 23826 1726867452.82438: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000005ee 23826 1726867452.82470: variable 'ansible_search_path' from source: unknown 23826 1726867452.82518: calling self._execute() 23826 1726867452.82679: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867452.82683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867452.82686: variable 'omit' from source: magic vars 23826 1726867452.83073: variable 'ansible_distribution_major_version' from source: facts 23826 1726867452.83091: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867452.83101: variable 'omit' from source: magic vars 23826 1726867452.83141: variable 'omit' from source: magic vars 23826 1726867452.83184: variable 'omit' from source: magic vars 23826 1726867452.83333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867452.83338: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867452.83341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867452.83343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867452.83352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867452.83389: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867452.83398: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867452.83405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867452.83516: Set connection var ansible_timeout to 10 23826 1726867452.83531: Set connection var ansible_shell_executable to /bin/sh 23826 1726867452.83538: Set connection var ansible_connection to ssh 23826 1726867452.83558: Set connection var ansible_pipelining to False 23826 1726867452.83565: Set connection var ansible_shell_type to sh 23826 1726867452.83574: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867452.83604: variable 'ansible_shell_executable' from source: unknown 23826 1726867452.83616: variable 'ansible_connection' from source: unknown 23826 1726867452.83623: variable 'ansible_module_compression' from source: unknown 23826 1726867452.83630: variable 'ansible_shell_type' from source: unknown 23826 1726867452.83637: variable 'ansible_shell_executable' from source: unknown 23826 1726867452.83643: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867452.83650: variable 'ansible_pipelining' from source: unknown 23826 1726867452.83665: variable 'ansible_timeout' from source: unknown 23826 1726867452.83673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867452.83881: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867452.83885: variable 'omit' from source: magic vars 23826 1726867452.83895: starting attempt loop 23826 1726867452.83987: running the 
handler 23826 1726867452.83991: variable 'ansible_facts' from source: unknown 23826 1726867452.83993: _low_level_execute_command(): starting 23826 1726867452.83996: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867452.84694: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867452.84715: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867452.84732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867452.84760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867452.84875: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867452.84901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867452.84989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867452.86680: stdout chunk (state=3): >>>/root <<< 23826 1726867452.86795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867452.86842: stderr chunk (state=3): >>><<< 23826 1726867452.86863: stdout chunk (state=3): >>><<< 23826 1726867452.86885: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867452.86978: _low_level_execute_command(): starting 23826 1726867452.86983: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867452.8689108-25537-28952204697630 `" && echo ansible-tmp-1726867452.8689108-25537-28952204697630="` echo /root/.ansible/tmp/ansible-tmp-1726867452.8689108-25537-28952204697630 `" ) && sleep 0' 23826 1726867452.87532: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867452.87546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867452.87563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867452.87641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867452.87645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867452.87721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867452.87803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867452.89762: stdout chunk (state=3): >>>ansible-tmp-1726867452.8689108-25537-28952204697630=/root/.ansible/tmp/ansible-tmp-1726867452.8689108-25537-28952204697630 <<< 23826 1726867452.89911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867452.89941: stdout chunk (state=3): >>><<< 23826 1726867452.89944: stderr chunk (state=3): >>><<< 23826 1726867452.89961: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867452.8689108-25537-28952204697630=/root/.ansible/tmp/ansible-tmp-1726867452.8689108-25537-28952204697630 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867452.89996: variable 
'ansible_module_compression' from source: unknown 23826 1726867452.90153: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 23826 1726867452.90156: variable 'ansible_facts' from source: unknown 23826 1726867452.90351: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867452.8689108-25537-28952204697630/AnsiballZ_setup.py 23826 1726867452.90505: Sending initial data 23826 1726867452.90613: Sent initial data (153 bytes) 23826 1726867452.91262: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867452.91286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867452.91307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867452.91394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867452.93054: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867452.93089: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867452.93126: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpnme4xuol /root/.ansible/tmp/ansible-tmp-1726867452.8689108-25537-28952204697630/AnsiballZ_setup.py <<< 23826 1726867452.93130: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867452.8689108-25537-28952204697630/AnsiballZ_setup.py" <<< 23826 1726867452.93192: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpnme4xuol" to remote "/root/.ansible/tmp/ansible-tmp-1726867452.8689108-25537-28952204697630/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867452.8689108-25537-28952204697630/AnsiballZ_setup.py" <<< 23826 1726867452.94819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867452.94856: stderr chunk (state=3): >>><<< 23826 1726867452.94913: stdout chunk (state=3): >>><<< 23826 1726867452.94917: done transferring module to remote 23826 1726867452.94921: _low_level_execute_command(): starting 23826 1726867452.94932: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867452.8689108-25537-28952204697630/ /root/.ansible/tmp/ansible-tmp-1726867452.8689108-25537-28952204697630/AnsiballZ_setup.py && sleep 0' 23826 1726867452.96005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867452.96048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867452.96065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867452.96091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867452.96215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867452.98072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867452.98090: stdout chunk (state=3): >>><<< 23826 1726867452.98112: stderr chunk (state=3): >>><<< 23826 1726867452.98184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867452.98192: _low_level_execute_command(): starting 23826 1726867452.98195: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867452.8689108-25537-28952204697630/AnsiballZ_setup.py && sleep 0' 23826 1726867452.99158: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867452.99178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867452.99214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867452.99618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867452.99712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867452.99797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867453.66694: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", 
"ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "24", "second": "13", "epoch": "1726867453", "epoch_int": "1726867453", "date": "2024-09-20", "time": "17:24:13", "iso8601_micro": "2024-09-20T21:24:13.271726Z", "iso8601": "2024-09-20T21:24:13Z", "iso8601_basic": "20240920T172413271726", "iso8601_basic_short": "20240920T172413", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSI<<< 23826 1726867453.66834: stdout chunk (state=3): >>>ON_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", 
"DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.4599609375, "5m": 0.38720703125, "15m": 0.22216796875}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["ethtest0", "eth0", "lo", "peerethtest0"], "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "f6:b4:26:aa:e3:d8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f4b4:26ff:feaa:e3d8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", 
"tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": 
"off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "ba:29:a4:e0:1c:3b", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b829:a4ff:fee0:1c3b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off 
[fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::f4b4:26ff:feaa:e3d8", "fe80::8ff:d5ff:fec3:77ad", "fe80::b829:a4ff:fee0:1c3b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad", "fe80::b829:a4ff:fee0:1c3b", "fe80::f4b4:26ff:feaa:e3d8"]}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2943, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 588, "free": 2943}, "nocache": {"free": 3282, "used": 249}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 691, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794500608, "block_size": 4096, "block_total": 65519099, "block_available": 63914673, "block_used": 1604426, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 23826 1726867453.68735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867453.68887: stderr chunk (state=3): >>>Shared connection to 10.31.12.116 closed. <<< 23826 1726867453.68921: stderr chunk (state=3): >>><<< 23826 1726867453.68924: stdout chunk (state=3): >>><<< 23826 1726867453.69014: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "24", "second": "13", "epoch": "1726867453", "epoch_int": "1726867453", "date": "2024-09-20", "time": "17:24:13", "iso8601_micro": "2024-09-20T21:24:13.271726Z", "iso8601": "2024-09-20T21:24:13Z", "iso8601_basic": "20240920T172413271726", "iso8601_basic_short": "20240920T172413", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.4599609375, "5m": 0.38720703125, "15m": 0.22216796875}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["ethtest0", "eth0", "lo", "peerethtest0"], "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "f6:b4:26:aa:e3:d8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f4b4:26ff:feaa:e3d8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on 
[fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "ba:29:a4:e0:1c:3b", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b829:a4ff:fee0:1c3b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", 
"scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::f4b4:26ff:feaa:e3d8", "fe80::8ff:d5ff:fec3:77ad", "fe80::b829:a4ff:fee0:1c3b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad", "fe80::b829:a4ff:fee0:1c3b", "fe80::f4b4:26ff:feaa:e3d8"]}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2943, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 588, "free": 2943}, "nocache": {"free": 3282, "used": 249}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": 
"Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 691, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794500608, "block_size": 4096, "block_total": 65519099, "block_available": 63914673, "block_used": 1604426, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
23826 1726867453.70040: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867452.8689108-25537-28952204697630/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867453.70043: _low_level_execute_command(): starting 23826 1726867453.70223: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867452.8689108-25537-28952204697630/ > /dev/null 2>&1 && sleep 0' 23826 1726867453.71234: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867453.71247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867453.71264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867453.71291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867453.71390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867453.71416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867453.71497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867453.73385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867453.73388: stdout chunk (state=3): >>><<< 23826 1726867453.73390: stderr chunk (state=3): >>><<< 23826 1726867453.73405: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867453.73786: handler run complete 23826 1726867453.73789: variable 'ansible_facts' from source: unknown 23826 1726867453.73960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867453.74681: variable 'ansible_facts' from source: unknown 23826 1726867453.74929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867453.75331: attempt loop complete, returning result 23826 1726867453.75341: _execute() done 23826 1726867453.75347: dumping result to json 23826 1726867453.75391: done dumping result, returning 23826 1726867453.75449: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-a92d-a3ea-0000000005ee] 23826 1726867453.75458: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000005ee 23826 1726867453.76884: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000005ee 23826 1726867453.76887: WORKER PROCESS EXITING ok: [managed_node2] 23826 1726867453.77496: no more pending results, returning what we have 23826 1726867453.77499: results queue empty 23826 1726867453.77500: checking for any_errors_fatal 23826 1726867453.77501: done checking for any_errors_fatal 23826 1726867453.77502: checking for max_fail_percentage 23826 1726867453.77504: done checking for max_fail_percentage 23826 1726867453.77505: checking to see if all hosts have failed and the running result is not ok 23826 1726867453.77506: done checking to see if all hosts have failed 23826 1726867453.77507: getting the remaining hosts for this loop 23826 1726867453.77508: done getting the remaining hosts for this loop 23826 1726867453.77511: getting the next task for host managed_node2 23826 1726867453.77517: done getting next task for host managed_node2 23826 1726867453.77519: ^ task is: TASK: meta (flush_handlers) 23826 1726867453.77521: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867453.77525: getting variables 23826 1726867453.77526: in VariableManager get_vars() 23826 1726867453.77548: Calling all_inventory to load vars for managed_node2 23826 1726867453.77551: Calling groups_inventory to load vars for managed_node2 23826 1726867453.77554: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867453.77564: Calling all_plugins_play to load vars for managed_node2 23826 1726867453.77567: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867453.77570: Calling groups_plugins_play to load vars for managed_node2 23826 1726867453.79486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867453.82273: done with get_vars() 23826 1726867453.82298: done getting variables 23826 1726867453.82366: in VariableManager get_vars() 23826 1726867453.82488: Calling all_inventory to load vars for managed_node2 23826 1726867453.82491: Calling groups_inventory to load vars for managed_node2 23826 1726867453.82494: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867453.82499: Calling all_plugins_play to load vars for managed_node2 23826 1726867453.82501: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867453.82503: Calling groups_plugins_play to load vars for managed_node2 23826 1726867453.85207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867453.87315: done with get_vars() 23826 1726867453.87341: done queuing things up, now waiting for results queue to drain 23826 1726867453.87344: results queue empty 23826 1726867453.87345: checking for any_errors_fatal 23826 1726867453.87353: done checking for any_errors_fatal 23826 1726867453.87354: checking for max_fail_percentage 23826 1726867453.87355: done checking for max_fail_percentage 23826 1726867453.87364: checking to see if all hosts have failed and the running result is not ok 23826 1726867453.87365: done checking to see if all hosts have failed 23826 1726867453.87366: getting the remaining hosts for this loop 23826 1726867453.87367: done getting the remaining hosts for this loop 23826 1726867453.87370: getting the next task for host managed_node2 23826 1726867453.87374: done getting next task for host managed_node2 23826 1726867453.87376: ^ task is: TASK: Include the task 'delete_interface.yml' 23826 1726867453.87381: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867453.87383: getting variables 23826 1726867453.87384: in VariableManager get_vars() 23826 1726867453.87392: Calling all_inventory to load vars for managed_node2 23826 1726867453.87394: Calling groups_inventory to load vars for managed_node2 23826 1726867453.87396: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867453.87401: Calling all_plugins_play to load vars for managed_node2 23826 1726867453.87403: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867453.87406: Calling groups_plugins_play to load vars for managed_node2 23826 1726867453.89199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867453.90659: done with get_vars() 23826 1726867453.90682: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:83 Friday 20 September 2024 17:24:13 -0400 (0:00:01.090) 0:00:35.918 ****** 23826 1726867453.90762: entering _queue_task() for managed_node2/include_tasks 23826 1726867453.91130: worker is 1 (out of 1 available) 23826 1726867453.91144: exiting _queue_task() for managed_node2/include_tasks 23826 1726867453.91155: done queuing things up, now waiting for results queue to drain 23826 1726867453.91156: waiting for pending results... 23826 1726867453.91914: running TaskExecutor() for managed_node2/TASK: Include the task 'delete_interface.yml' 23826 1726867453.91959: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000009c 23826 1726867453.91983: variable 'ansible_search_path' from source: unknown 23826 1726867453.92029: calling self._execute() 23826 1726867453.92385: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867453.92389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867453.92392: variable 'omit' from source: magic vars 23826 1726867453.92838: variable 'ansible_distribution_major_version' from source: facts 23826 1726867453.93007: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867453.93019: _execute() done 23826 1726867453.93089: dumping result to json 23826 1726867453.93102: done dumping result, returning 23826 1726867453.93114: done running TaskExecutor() for managed_node2/TASK: Include the task 'delete_interface.yml' [0affcac9-a3a5-a92d-a3ea-00000000009c] 23826 1726867453.93123: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000009c 23826 1726867453.93254: no more pending results, returning what we have 23826 1726867453.93259: in VariableManager get_vars() 23826 1726867453.93299: Calling all_inventory to load vars for managed_node2 23826 1726867453.93302: Calling groups_inventory to load vars for managed_node2 23826 1726867453.93305: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867453.93321: Calling all_plugins_play to load vars for managed_node2 23826 1726867453.93324: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867453.93327: Calling groups_plugins_play to load vars for managed_node2 23826 1726867453.94292: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000009c 23826 1726867453.94296: WORKER PROCESS EXITING 23826 1726867453.95252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867453.96947: done with get_vars() 23826 
1726867453.96971: variable 'ansible_search_path' from source: unknown 23826 1726867453.96990: we have included files to process 23826 1726867453.96992: generating all_blocks data 23826 1726867453.96993: done generating all_blocks data 23826 1726867453.96994: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 23826 1726867453.96995: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 23826 1726867453.96998: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 23826 1726867453.97248: done processing included file 23826 1726867453.97250: iterating over new_blocks loaded from include file 23826 1726867453.97252: in VariableManager get_vars() 23826 1726867453.97264: done with get_vars() 23826 1726867453.97266: filtering new block on tags 23826 1726867453.97482: done filtering new block on tags 23826 1726867453.97485: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node2 23826 1726867453.97490: extending task lists for all hosts with included blocks 23826 1726867453.97572: done extending task lists 23826 1726867453.97573: done processing included files 23826 1726867453.97574: results queue empty 23826 1726867453.97575: checking for any_errors_fatal 23826 1726867453.97576: done checking for any_errors_fatal 23826 1726867453.97578: checking for max_fail_percentage 23826 1726867453.97579: done checking for max_fail_percentage 23826 1726867453.97580: checking to see if all hosts have failed and the running result is not ok 23826 1726867453.97581: done checking to see if all hosts have failed 23826 1726867453.97581: getting the remaining hosts for this loop 23826 1726867453.97582: done getting the remaining hosts for this loop 23826 1726867453.97585: getting the next task for host managed_node2 23826 1726867453.97588: done getting next task for host managed_node2 23826 1726867453.97590: ^ task is: TASK: Remove test interface if necessary 23826 1726867453.97592: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867453.97594: getting variables 23826 1726867453.97595: in VariableManager get_vars() 23826 1726867453.97604: Calling all_inventory to load vars for managed_node2 23826 1726867453.97606: Calling groups_inventory to load vars for managed_node2 23826 1726867453.97608: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867453.97614: Calling all_plugins_play to load vars for managed_node2 23826 1726867453.97882: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867453.97886: Calling groups_plugins_play to load vars for managed_node2 23826 1726867453.99373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867454.01156: done with get_vars() 23826 1726867454.01194: done getting variables 23826 1726867454.01240: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 17:24:14 -0400 (0:00:00.105) 0:00:36.024 ****** 23826 1726867454.01273: entering _queue_task() for managed_node2/command 23826 1726867454.01987: worker is 1 (out of 1 available) 23826 1726867454.02001: exiting _queue_task() for managed_node2/command 23826 1726867454.02012: done queuing things up, now waiting for results queue to drain 23826 1726867454.02013: waiting for pending results... 23826 1726867454.02315: running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary 23826 1726867454.02429: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000005ff 23826 1726867454.02472: variable 'ansible_search_path' from source: unknown 23826 1726867454.02484: variable 'ansible_search_path' from source: unknown 23826 1726867454.02531: calling self._execute() 23826 1726867454.02718: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867454.02722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867454.02725: variable 'omit' from source: magic vars 23826 1726867454.03061: variable 'ansible_distribution_major_version' from source: facts 23826 1726867454.03081: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867454.03094: variable 'omit' from source: magic vars 23826 1726867454.03138: variable 'omit' from source: magic vars 23826 1726867454.03242: variable 'interface' from source: set_fact 23826 1726867454.03273: variable 'omit' from source: magic vars 23826 1726867454.03320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867454.03365: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867454.03397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867454.03420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867454.03438: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 
1726867454.03482: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867454.03582: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867454.03588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867454.03606: Set connection var ansible_timeout to 10 23826 1726867454.03621: Set connection var ansible_shell_executable to /bin/sh 23826 1726867454.03629: Set connection var ansible_connection to ssh 23826 1726867454.03641: Set connection var ansible_pipelining to False 23826 1726867454.03648: Set connection var ansible_shell_type to sh 23826 1726867454.03660: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867454.03691: variable 'ansible_shell_executable' from source: unknown 23826 1726867454.03704: variable 'ansible_connection' from source: unknown 23826 1726867454.03712: variable 'ansible_module_compression' from source: unknown 23826 1726867454.03810: variable 'ansible_shell_type' from source: unknown 23826 1726867454.03813: variable 'ansible_shell_executable' from source: unknown 23826 1726867454.03816: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867454.03818: variable 'ansible_pipelining' from source: unknown 23826 1726867454.03820: variable 'ansible_timeout' from source: unknown 23826 1726867454.03822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867454.03898: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867454.03918: variable 'omit' from source: magic vars 23826 1726867454.03930: starting attempt loop 23826 1726867454.03938: running the handler 23826 1726867454.03959: _low_level_execute_command(): starting 23826 1726867454.03971: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867454.04776: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867454.04794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867454.04870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867454.06574: stdout chunk (state=3): >>>/root <<< 23826 1726867454.06729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
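
For orientation: the records above and below all belong to a single task, loaded from delete_interface.yml:3 and run through the command action plugin. Below is a minimal sketch of a task that would produce this trace, reconstructed only from what the log itself reports (the task name, the interface fact resolving to ethtest0, the ip link del command, and the ok/changed=false result further down); the actual file shipped in the fedora.linux_system_roles collection may differ in detail.

    - name: Remove test interface if necessary
      # "interface" comes from an earlier set_fact; in this run it resolves to ethtest0.
      command: ip link del {{ interface }}
      # Assumption: the raw module result below reports changed=true, yet the task is
      # finally displayed as ok with changed=false after a single
      # "Evaluated conditional (False): False" record -- the pattern a result-shaping
      # keyword like this produces.
      changed_when: false
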
23826 1726867454.06732: stdout chunk (state=3): >>><<< 23826 1726867454.06735: stderr chunk (state=3): >>><<< 23826 1726867454.06848: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867454.06854: _low_level_execute_command(): starting 23826 1726867454.06858: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867454.0676649-25582-272572422813706 `" && echo ansible-tmp-1726867454.0676649-25582-272572422813706="` echo /root/.ansible/tmp/ansible-tmp-1726867454.0676649-25582-272572422813706 `" ) && sleep 0' 23826 1726867454.07439: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867454.07468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867454.07543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867454.09511: stdout chunk (state=3): >>>ansible-tmp-1726867454.0676649-25582-272572422813706=/root/.ansible/tmp/ansible-tmp-1726867454.0676649-25582-272572422813706 <<< 23826 1726867454.09744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867454.09755: stdout chunk (state=3): >>><<< 23826 1726867454.09766: stderr chunk (state=3): >>><<< 23826 1726867454.09986: _low_level_execute_command() done: 
rc=0, stdout=ansible-tmp-1726867454.0676649-25582-272572422813706=/root/.ansible/tmp/ansible-tmp-1726867454.0676649-25582-272572422813706 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867454.09990: variable 'ansible_module_compression' from source: unknown 23826 1726867454.09993: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 23826 1726867454.09995: variable 'ansible_facts' from source: unknown 23826 1726867454.10011: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867454.0676649-25582-272572422813706/AnsiballZ_command.py 23826 1726867454.10223: Sending initial data 23826 1726867454.10232: Sent initial data (156 bytes) 23826 1726867454.10802: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867454.10817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867454.10832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867454.10857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867454.10876: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867454.10897: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867454.10975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867454.11006: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867454.11023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867454.11051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867454.11117: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 23826 1726867454.12735: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867454.12798: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 23826 1726867454.12872: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpqhud5ajg /root/.ansible/tmp/ansible-tmp-1726867454.0676649-25582-272572422813706/AnsiballZ_command.py <<< 23826 1726867454.12892: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867454.0676649-25582-272572422813706/AnsiballZ_command.py" <<< 23826 1726867454.12931: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpqhud5ajg" to remote "/root/.ansible/tmp/ansible-tmp-1726867454.0676649-25582-272572422813706/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867454.0676649-25582-272572422813706/AnsiballZ_command.py" <<< 23826 1726867454.13965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867454.13970: stderr chunk (state=3): >>><<< 23826 1726867454.13973: stdout chunk (state=3): >>><<< 23826 1726867454.13975: done transferring module to remote 23826 1726867454.13979: _low_level_execute_command(): starting 23826 1726867454.13982: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867454.0676649-25582-272572422813706/ /root/.ansible/tmp/ansible-tmp-1726867454.0676649-25582-272572422813706/AnsiballZ_command.py && sleep 0' 23826 1726867454.14860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867454.14902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867454.14918: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867454.14999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867454.16876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867454.16892: stdout chunk (state=3): >>><<< 23826 1726867454.16906: stderr chunk (state=3): >>><<< 23826 1726867454.16924: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867454.16931: _low_level_execute_command(): starting 23826 1726867454.16939: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867454.0676649-25582-272572422813706/AnsiballZ_command.py && sleep 0' 23826 1726867454.17544: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867454.17559: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867454.17572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867454.17596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867454.17648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867454.17711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867454.17730: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867454.17764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867454.17842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867454.34885: 
stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 17:24:14.331927", "end": "2024-09-20 17:24:14.344239", "delta": "0:00:00.012312", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 23826 1726867454.37051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 23826 1726867454.37169: stderr chunk (state=3): >>><<< 23826 1726867454.37173: stdout chunk (state=3): >>><<< 23826 1726867454.37176: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 17:24:14.331927", "end": "2024-09-20 17:24:14.344239", "delta": "0:00:00.012312", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
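
The module_args captured in the result above ("_raw_params": "ip link del ethtest0", "_uses_shell": false, "argv": null) show the free-form command form being used, executed without a shell. As an illustrative alternative only (not the form the test file actually uses), the same call could be written with the command module's explicit argv list, which avoids word-splitting entirely:

    - name: Remove test interface if necessary (argv form)
      # Equivalent to the free-form invocation recorded above; each list element is
      # passed to the module as a separate argument, no shell involved.
      command:
        argv:
          - ip
          - link
          - del
          - "{{ interface }}"
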
23826 1726867454.37182: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867454.0676649-25582-272572422813706/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867454.37185: _low_level_execute_command(): starting 23826 1726867454.37187: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867454.0676649-25582-272572422813706/ > /dev/null 2>&1 && sleep 0' 23826 1726867454.37837: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867454.37888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867454.38167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867454.38191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867454.38274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867454.40154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867454.40212: stderr chunk (state=3): >>><<< 23826 1726867454.40225: stdout chunk (state=3): >>><<< 23826 1726867454.40246: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 
10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867454.40257: handler run complete 23826 1726867454.40296: Evaluated conditional (False): False 23826 1726867454.40311: attempt loop complete, returning result 23826 1726867454.40318: _execute() done 23826 1726867454.40324: dumping result to json 23826 1726867454.40483: done dumping result, returning 23826 1726867454.40486: done running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary [0affcac9-a3a5-a92d-a3ea-0000000005ff] 23826 1726867454.40488: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000005ff 23826 1726867454.40556: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000005ff 23826 1726867454.40559: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.012312", "end": "2024-09-20 17:24:14.344239", "rc": 0, "start": "2024-09-20 17:24:14.331927" } 23826 1726867454.40629: no more pending results, returning what we have 23826 1726867454.40633: results queue empty 23826 1726867454.40633: checking for any_errors_fatal 23826 1726867454.40635: done checking for any_errors_fatal 23826 1726867454.40636: checking for max_fail_percentage 23826 1726867454.40637: done checking for max_fail_percentage 23826 1726867454.40638: checking to see if all hosts have failed and the running result is not ok 23826 1726867454.40639: done checking to see if all hosts have failed 23826 1726867454.40640: getting the remaining hosts for this loop 23826 1726867454.40642: done getting the remaining hosts for this loop 23826 1726867454.40645: getting the next task for host managed_node2 23826 1726867454.40653: done getting next task for host managed_node2 23826 1726867454.40655: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 23826 1726867454.40657: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867454.40662: getting variables 23826 1726867454.40663: in VariableManager get_vars() 23826 1726867454.40701: Calling all_inventory to load vars for managed_node2 23826 1726867454.40704: Calling groups_inventory to load vars for managed_node2 23826 1726867454.40707: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867454.40719: Calling all_plugins_play to load vars for managed_node2 23826 1726867454.40721: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867454.40724: Calling groups_plugins_play to load vars for managed_node2 23826 1726867454.42441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867454.44068: done with get_vars() 23826 1726867454.44099: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:85 Friday 20 September 2024 17:24:14 -0400 (0:00:00.429) 0:00:36.453 ****** 23826 1726867454.44210: entering _queue_task() for managed_node2/include_tasks 23826 1726867454.44608: worker is 1 (out of 1 available) 23826 1726867454.44621: exiting _queue_task() for managed_node2/include_tasks 23826 1726867454.44634: done queuing things up, now waiting for results queue to drain 23826 1726867454.44636: waiting for pending results... 23826 1726867454.45000: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_absent.yml' 23826 1726867454.45016: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000009d 23826 1726867454.45036: variable 'ansible_search_path' from source: unknown 23826 1726867454.45079: calling self._execute() 23826 1726867454.45221: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867454.45228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867454.45232: variable 'omit' from source: magic vars 23826 1726867454.45608: variable 'ansible_distribution_major_version' from source: facts 23826 1726867454.45624: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867454.45632: _execute() done 23826 1726867454.45638: dumping result to json 23826 1726867454.45644: done dumping result, returning 23826 1726867454.45662: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_absent.yml' [0affcac9-a3a5-a92d-a3ea-00000000009d] 23826 1726867454.45768: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000009d 23826 1726867454.45840: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000009d 23826 1726867454.45844: WORKER PROCESS EXITING 23826 1726867454.45899: no more pending results, returning what we have 23826 1726867454.45903: in VariableManager get_vars() 23826 1726867454.45941: Calling all_inventory to load vars for managed_node2 23826 1726867454.45944: Calling groups_inventory to load vars for managed_node2 23826 1726867454.45948: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867454.46171: Calling all_plugins_play to load vars for managed_node2 23826 1726867454.46174: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867454.46180: Calling groups_plugins_play to load vars for managed_node2 23826 1726867454.47506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867454.49296: done with get_vars() 23826 
1726867454.49315: variable 'ansible_search_path' from source: unknown 23826 1726867454.49329: we have included files to process 23826 1726867454.49330: generating all_blocks data 23826 1726867454.49332: done generating all_blocks data 23826 1726867454.49337: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 23826 1726867454.49338: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 23826 1726867454.49341: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 23826 1726867454.49514: in VariableManager get_vars() 23826 1726867454.49533: done with get_vars() 23826 1726867454.49647: done processing included file 23826 1726867454.49650: iterating over new_blocks loaded from include file 23826 1726867454.49651: in VariableManager get_vars() 23826 1726867454.49665: done with get_vars() 23826 1726867454.49666: filtering new block on tags 23826 1726867454.49690: done filtering new block on tags 23826 1726867454.49692: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node2 23826 1726867454.49697: extending task lists for all hosts with included blocks 23826 1726867454.49842: done extending task lists 23826 1726867454.49844: done processing included files 23826 1726867454.49845: results queue empty 23826 1726867454.49845: checking for any_errors_fatal 23826 1726867454.49850: done checking for any_errors_fatal 23826 1726867454.49851: checking for max_fail_percentage 23826 1726867454.49852: done checking for max_fail_percentage 23826 1726867454.49853: checking to see if all hosts have failed and the running result is not ok 23826 1726867454.49854: done checking to see if all hosts have failed 23826 1726867454.49854: getting the remaining hosts for this loop 23826 1726867454.49856: done getting the remaining hosts for this loop 23826 1726867454.49858: getting the next task for host managed_node2 23826 1726867454.49862: done getting next task for host managed_node2 23826 1726867454.49864: ^ task is: TASK: Include the task 'get_profile_stat.yml' 23826 1726867454.49867: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867454.49869: getting variables 23826 1726867454.49870: in VariableManager get_vars() 23826 1726867454.49880: Calling all_inventory to load vars for managed_node2 23826 1726867454.49883: Calling groups_inventory to load vars for managed_node2 23826 1726867454.49885: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867454.49891: Calling all_plugins_play to load vars for managed_node2 23826 1726867454.49893: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867454.49901: Calling groups_plugins_play to load vars for managed_node2 23826 1726867454.51055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867454.52685: done with get_vars() 23826 1726867454.52708: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 17:24:14 -0400 (0:00:00.085) 0:00:36.539 ****** 23826 1726867454.52789: entering _queue_task() for managed_node2/include_tasks 23826 1726867454.53132: worker is 1 (out of 1 available) 23826 1726867454.53144: exiting _queue_task() for managed_node2/include_tasks 23826 1726867454.53158: done queuing things up, now waiting for results queue to drain 23826 1726867454.53160: waiting for pending results... 23826 1726867454.53509: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 23826 1726867454.53559: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000612 23826 1726867454.53605: variable 'ansible_search_path' from source: unknown 23826 1726867454.53608: variable 'ansible_search_path' from source: unknown 23826 1726867454.53632: calling self._execute() 23826 1726867454.53736: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867454.53783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867454.53787: variable 'omit' from source: magic vars 23826 1726867454.54139: variable 'ansible_distribution_major_version' from source: facts 23826 1726867454.54163: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867454.54171: _execute() done 23826 1726867454.54181: dumping result to json 23826 1726867454.54190: done dumping result, returning 23826 1726867454.54260: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-a92d-a3ea-000000000612] 23826 1726867454.54264: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000612 23826 1726867454.54340: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000612 23826 1726867454.54344: WORKER PROCESS EXITING 23826 1726867454.54398: no more pending results, returning what we have 23826 1726867454.54404: in VariableManager get_vars() 23826 1726867454.54445: Calling all_inventory to load vars for managed_node2 23826 1726867454.54449: Calling groups_inventory to load vars for managed_node2 23826 1726867454.54453: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867454.54471: Calling all_plugins_play to load vars for managed_node2 23826 1726867454.54475: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867454.54480: Calling groups_plugins_play to load vars for managed_node2 23826 1726867454.56286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 23826 1726867454.57875: done with get_vars() 23826 1726867454.57896: variable 'ansible_search_path' from source: unknown 23826 1726867454.57897: variable 'ansible_search_path' from source: unknown 23826 1726867454.57936: we have included files to process 23826 1726867454.57938: generating all_blocks data 23826 1726867454.57939: done generating all_blocks data 23826 1726867454.57945: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 23826 1726867454.57946: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 23826 1726867454.57949: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 23826 1726867454.59017: done processing included file 23826 1726867454.59019: iterating over new_blocks loaded from include file 23826 1726867454.59021: in VariableManager get_vars() 23826 1726867454.59039: done with get_vars() 23826 1726867454.59041: filtering new block on tags 23826 1726867454.59065: done filtering new block on tags 23826 1726867454.59068: in VariableManager get_vars() 23826 1726867454.59081: done with get_vars() 23826 1726867454.59083: filtering new block on tags 23826 1726867454.59104: done filtering new block on tags 23826 1726867454.59106: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 23826 1726867454.59111: extending task lists for all hosts with included blocks 23826 1726867454.59212: done extending task lists 23826 1726867454.59214: done processing included files 23826 1726867454.59214: results queue empty 23826 1726867454.59215: checking for any_errors_fatal 23826 1726867454.59218: done checking for any_errors_fatal 23826 1726867454.59219: checking for max_fail_percentage 23826 1726867454.59220: done checking for max_fail_percentage 23826 1726867454.59220: checking to see if all hosts have failed and the running result is not ok 23826 1726867454.59221: done checking to see if all hosts have failed 23826 1726867454.59222: getting the remaining hosts for this loop 23826 1726867454.59223: done getting the remaining hosts for this loop 23826 1726867454.59226: getting the next task for host managed_node2 23826 1726867454.59230: done getting next task for host managed_node2 23826 1726867454.59232: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 23826 1726867454.59235: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867454.59237: getting variables 23826 1726867454.59238: in VariableManager get_vars() 23826 1726867454.59300: Calling all_inventory to load vars for managed_node2 23826 1726867454.59303: Calling groups_inventory to load vars for managed_node2 23826 1726867454.59306: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867454.59311: Calling all_plugins_play to load vars for managed_node2 23826 1726867454.59313: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867454.59316: Calling groups_plugins_play to load vars for managed_node2 23826 1726867454.60416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867454.62008: done with get_vars() 23826 1726867454.62029: done getting variables 23826 1726867454.62076: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:24:14 -0400 (0:00:00.093) 0:00:36.632 ****** 23826 1726867454.62108: entering _queue_task() for managed_node2/set_fact 23826 1726867454.62451: worker is 1 (out of 1 available) 23826 1726867454.62463: exiting _queue_task() for managed_node2/set_fact 23826 1726867454.62476: done queuing things up, now waiting for results queue to drain 23826 1726867454.62479: waiting for pending results... 
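
The task queued here, loaded from get_profile_stat.yml:3, only initializes bookkeeping facts; its result a few records below shows exactly which flags it sets. A minimal sketch consistent with that output (the real task file may differ in wording or ordering):

    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        # Values taken from the ansible_facts shown in this task's result record.
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false
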
23826 1726867454.62843: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 23826 1726867454.62899: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000062a 23826 1726867454.62937: variable 'ansible_search_path' from source: unknown 23826 1726867454.62940: variable 'ansible_search_path' from source: unknown 23826 1726867454.62969: calling self._execute() 23826 1726867454.63283: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867454.63286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867454.63289: variable 'omit' from source: magic vars 23826 1726867454.63461: variable 'ansible_distribution_major_version' from source: facts 23826 1726867454.63480: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867454.63492: variable 'omit' from source: magic vars 23826 1726867454.63547: variable 'omit' from source: magic vars 23826 1726867454.63589: variable 'omit' from source: magic vars 23826 1726867454.63637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867454.63673: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867454.63698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867454.63718: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867454.63739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867454.63770: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867454.63780: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867454.63787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867454.63906: Set connection var ansible_timeout to 10 23826 1726867454.63921: Set connection var ansible_shell_executable to /bin/sh 23826 1726867454.63930: Set connection var ansible_connection to ssh 23826 1726867454.63950: Set connection var ansible_pipelining to False 23826 1726867454.63958: Set connection var ansible_shell_type to sh 23826 1726867454.63968: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867454.64002: variable 'ansible_shell_executable' from source: unknown 23826 1726867454.64011: variable 'ansible_connection' from source: unknown 23826 1726867454.64019: variable 'ansible_module_compression' from source: unknown 23826 1726867454.64026: variable 'ansible_shell_type' from source: unknown 23826 1726867454.64032: variable 'ansible_shell_executable' from source: unknown 23826 1726867454.64038: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867454.64047: variable 'ansible_pipelining' from source: unknown 23826 1726867454.64062: variable 'ansible_timeout' from source: unknown 23826 1726867454.64070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867454.64281: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867454.64285: variable 
'omit' from source: magic vars 23826 1726867454.64287: starting attempt loop 23826 1726867454.64289: running the handler 23826 1726867454.64292: handler run complete 23826 1726867454.64294: attempt loop complete, returning result 23826 1726867454.64296: _execute() done 23826 1726867454.64298: dumping result to json 23826 1726867454.64300: done dumping result, returning 23826 1726867454.64312: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-a92d-a3ea-00000000062a] 23826 1726867454.64322: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000062a ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 23826 1726867454.64576: no more pending results, returning what we have 23826 1726867454.64588: results queue empty 23826 1726867454.64590: checking for any_errors_fatal 23826 1726867454.64592: done checking for any_errors_fatal 23826 1726867454.64593: checking for max_fail_percentage 23826 1726867454.64595: done checking for max_fail_percentage 23826 1726867454.64596: checking to see if all hosts have failed and the running result is not ok 23826 1726867454.64597: done checking to see if all hosts have failed 23826 1726867454.64598: getting the remaining hosts for this loop 23826 1726867454.64600: done getting the remaining hosts for this loop 23826 1726867454.64604: getting the next task for host managed_node2 23826 1726867454.64612: done getting next task for host managed_node2 23826 1726867454.64614: ^ task is: TASK: Stat profile file 23826 1726867454.64619: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867454.64630: getting variables 23826 1726867454.64632: in VariableManager get_vars() 23826 1726867454.64664: Calling all_inventory to load vars for managed_node2 23826 1726867454.64668: Calling groups_inventory to load vars for managed_node2 23826 1726867454.64671: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867454.64685: Calling all_plugins_play to load vars for managed_node2 23826 1726867454.64688: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867454.64692: Calling groups_plugins_play to load vars for managed_node2 23826 1726867454.65359: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000062a 23826 1726867454.65362: WORKER PROCESS EXITING 23826 1726867454.70673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867454.72236: done with get_vars() 23826 1726867454.72264: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:24:14 -0400 (0:00:00.102) 0:00:36.734 ****** 23826 1726867454.72342: entering _queue_task() for managed_node2/stat 23826 1726867454.72889: worker is 1 (out of 1 available) 23826 1726867454.72900: exiting _queue_task() for managed_node2/stat 23826 1726867454.72909: done queuing things up, now waiting for results queue to drain 23826 1726867454.72910: waiting for pending results... 23826 1726867454.73002: running TaskExecutor() for managed_node2/TASK: Stat profile file 23826 1726867454.73152: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000062b 23826 1726867454.73173: variable 'ansible_search_path' from source: unknown 23826 1726867454.73184: variable 'ansible_search_path' from source: unknown 23826 1726867454.73224: calling self._execute() 23826 1726867454.73330: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867454.73344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867454.73367: variable 'omit' from source: magic vars 23826 1726867454.73754: variable 'ansible_distribution_major_version' from source: facts 23826 1726867454.73771: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867454.73790: variable 'omit' from source: magic vars 23826 1726867454.73842: variable 'omit' from source: magic vars 23826 1726867454.73947: variable 'profile' from source: include params 23826 1726867454.73958: variable 'interface' from source: set_fact 23826 1726867454.74036: variable 'interface' from source: set_fact 23826 1726867454.74062: variable 'omit' from source: magic vars 23826 1726867454.74106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867454.74153: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867454.74179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867454.74229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867454.74232: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867454.74256: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 23826 1726867454.74265: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867454.74274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867454.74446: Set connection var ansible_timeout to 10 23826 1726867454.74449: Set connection var ansible_shell_executable to /bin/sh 23826 1726867454.74452: Set connection var ansible_connection to ssh 23826 1726867454.74455: Set connection var ansible_pipelining to False 23826 1726867454.74457: Set connection var ansible_shell_type to sh 23826 1726867454.74459: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867454.74461: variable 'ansible_shell_executable' from source: unknown 23826 1726867454.74464: variable 'ansible_connection' from source: unknown 23826 1726867454.74466: variable 'ansible_module_compression' from source: unknown 23826 1726867454.74468: variable 'ansible_shell_type' from source: unknown 23826 1726867454.74471: variable 'ansible_shell_executable' from source: unknown 23826 1726867454.74476: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867454.74488: variable 'ansible_pipelining' from source: unknown 23826 1726867454.74496: variable 'ansible_timeout' from source: unknown 23826 1726867454.74504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867454.74712: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 23826 1726867454.74728: variable 'omit' from source: magic vars 23826 1726867454.74738: starting attempt loop 23826 1726867454.74744: running the handler 23826 1726867454.74772: _low_level_execute_command(): starting 23826 1726867454.74882: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867454.75535: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867454.75552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867454.75618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867454.75640: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867454.75713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867454.77404: stdout chunk (state=3): >>>/root <<< 23826 1726867454.77505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 
1726867454.77531: stderr chunk (state=3): >>><<< 23826 1726867454.77539: stdout chunk (state=3): >>><<< 23826 1726867454.77560: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867454.77576: _low_level_execute_command(): starting 23826 1726867454.77590: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867454.7756462-25602-258298897259775 `" && echo ansible-tmp-1726867454.7756462-25602-258298897259775="` echo /root/.ansible/tmp/ansible-tmp-1726867454.7756462-25602-258298897259775 `" ) && sleep 0' 23826 1726867454.78097: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867454.78121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867454.78137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867454.78155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867454.78225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867454.80210: stdout chunk (state=3): >>>ansible-tmp-1726867454.7756462-25602-258298897259775=/root/.ansible/tmp/ansible-tmp-1726867454.7756462-25602-258298897259775 <<< 23826 1726867454.80314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867454.80337: stderr chunk (state=3): >>><<< 23826 
1726867454.80344: stdout chunk (state=3): >>><<< 23826 1726867454.80357: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867454.7756462-25602-258298897259775=/root/.ansible/tmp/ansible-tmp-1726867454.7756462-25602-258298897259775 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867454.80393: variable 'ansible_module_compression' from source: unknown 23826 1726867454.80439: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 23826 1726867454.80471: variable 'ansible_facts' from source: unknown 23826 1726867454.80532: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867454.7756462-25602-258298897259775/AnsiballZ_stat.py 23826 1726867454.80669: Sending initial data 23826 1726867454.80672: Sent initial data (153 bytes) 23826 1726867454.81255: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867454.81310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867454.81356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867454.81418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867454.83041: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 23826 1726867454.83069: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867454.83105: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 23826 1726867454.83154: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpav5x3x62 /root/.ansible/tmp/ansible-tmp-1726867454.7756462-25602-258298897259775/AnsiballZ_stat.py <<< 23826 1726867454.83157: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867454.7756462-25602-258298897259775/AnsiballZ_stat.py" <<< 23826 1726867454.83211: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpav5x3x62" to remote "/root/.ansible/tmp/ansible-tmp-1726867454.7756462-25602-258298897259775/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867454.7756462-25602-258298897259775/AnsiballZ_stat.py" <<< 23826 1726867454.83924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867454.83972: stderr chunk (state=3): >>><<< 23826 1726867454.83983: stdout chunk (state=3): >>><<< 23826 1726867454.84129: done transferring module to remote 23826 1726867454.84132: _low_level_execute_command(): starting 23826 1726867454.84135: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867454.7756462-25602-258298897259775/ /root/.ansible/tmp/ansible-tmp-1726867454.7756462-25602-258298897259775/AnsiballZ_stat.py && sleep 0' 23826 1726867454.84671: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867454.84688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867454.84792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867454.84823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867454.84839: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867454.84860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 
1726867454.84934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867454.86780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867454.86801: stdout chunk (state=3): >>><<< 23826 1726867454.86813: stderr chunk (state=3): >>><<< 23826 1726867454.86835: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867454.86917: _low_level_execute_command(): starting 23826 1726867454.86921: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867454.7756462-25602-258298897259775/AnsiballZ_stat.py && sleep 0' 23826 1726867454.87472: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867454.87549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867454.87599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867454.87615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867454.87636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867454.87723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867455.03018: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": 
false, "checksum_algorithm": "sha1"}}} <<< 23826 1726867455.04395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 23826 1726867455.04422: stderr chunk (state=3): >>><<< 23826 1726867455.04426: stdout chunk (state=3): >>><<< 23826 1726867455.04446: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
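The stat result just returned makes the shape of the underlying task visible: the module args show path /etc/sysconfig/network-scripts/ifcfg-ethtest0 with attribute, checksum and MIME collection disabled, and the variable consumed two tasks later is profile_stat, so the result is presumably registered under that name. A minimal sketch of such a task, reconstructed from those module args (the {{ profile }} templating of the path is an assumption; the log only shows the rendered value):

    - name: Stat profile file
      stat:
        # rendered in this run as /etc/sysconfig/network-scripts/ifcfg-ethtest0
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
        follow: false
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat   # name inferred from the profile_stat variable checked below
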
23826 1726867455.04471: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867454.7756462-25602-258298897259775/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867455.04480: _low_level_execute_command(): starting 23826 1726867455.04486: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867454.7756462-25602-258298897259775/ > /dev/null 2>&1 && sleep 0' 23826 1726867455.05055: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867455.05058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 23826 1726867455.05061: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867455.05102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867455.05137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867455.06976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867455.07000: stderr chunk (state=3): >>><<< 23826 1726867455.07003: stdout chunk (state=3): >>><<< 23826 1726867455.07019: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867455.07027: handler run complete 23826 1726867455.07041: attempt loop complete, returning result 23826 1726867455.07044: _execute() done 23826 1726867455.07047: dumping result to json 23826 1726867455.07049: done dumping result, returning 23826 1726867455.07056: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0affcac9-a3a5-a92d-a3ea-00000000062b] 23826 1726867455.07059: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000062b 23826 1726867455.07151: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000062b 23826 1726867455.07153: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 23826 1726867455.07231: no more pending results, returning what we have 23826 1726867455.07234: results queue empty 23826 1726867455.07235: checking for any_errors_fatal 23826 1726867455.07242: done checking for any_errors_fatal 23826 1726867455.07242: checking for max_fail_percentage 23826 1726867455.07244: done checking for max_fail_percentage 23826 1726867455.07245: checking to see if all hosts have failed and the running result is not ok 23826 1726867455.07246: done checking to see if all hosts have failed 23826 1726867455.07247: getting the remaining hosts for this loop 23826 1726867455.07248: done getting the remaining hosts for this loop 23826 1726867455.07252: getting the next task for host managed_node2 23826 1726867455.07259: done getting next task for host managed_node2 23826 1726867455.07263: ^ task is: TASK: Set NM profile exist flag based on the profile files 23826 1726867455.07267: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867455.07271: getting variables 23826 1726867455.07273: in VariableManager get_vars() 23826 1726867455.07304: Calling all_inventory to load vars for managed_node2 23826 1726867455.07307: Calling groups_inventory to load vars for managed_node2 23826 1726867455.07310: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867455.07321: Calling all_plugins_play to load vars for managed_node2 23826 1726867455.07323: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867455.07326: Calling groups_plugins_play to load vars for managed_node2 23826 1726867455.08138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867455.09427: done with get_vars() 23826 1726867455.09453: done getting variables 23826 1726867455.09499: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:24:15 -0400 (0:00:00.371) 0:00:37.106 ****** 23826 1726867455.09525: entering _queue_task() for managed_node2/set_fact 23826 1726867455.09748: worker is 1 (out of 1 available) 23826 1726867455.09759: exiting _queue_task() for managed_node2/set_fact 23826 1726867455.09769: done queuing things up, now waiting for results queue to drain 23826 1726867455.09771: waiting for pending results... 
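The set_fact task queued above is evaluated in the next records and skipped, because profile_stat.stat.exists is false for the missing ifcfg file. A minimal sketch of what such a conditional flag task could look like; the fact name is a placeholder, since the log records only the task name and the failing condition:

    - name: Set NM profile exist flag based on the profile files
      set_fact:
        profile_file_exists: true   # placeholder name, not visible in this log
      when: profile_stat.stat.exists
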
23826 1726867455.09962: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 23826 1726867455.10054: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000062c 23826 1726867455.10067: variable 'ansible_search_path' from source: unknown 23826 1726867455.10071: variable 'ansible_search_path' from source: unknown 23826 1726867455.10104: calling self._execute() 23826 1726867455.10179: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867455.10183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867455.10193: variable 'omit' from source: magic vars 23826 1726867455.10469: variable 'ansible_distribution_major_version' from source: facts 23826 1726867455.10479: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867455.10564: variable 'profile_stat' from source: set_fact 23826 1726867455.10575: Evaluated conditional (profile_stat.stat.exists): False 23826 1726867455.10579: when evaluation is False, skipping this task 23826 1726867455.10582: _execute() done 23826 1726867455.10585: dumping result to json 23826 1726867455.10588: done dumping result, returning 23826 1726867455.10594: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-a92d-a3ea-00000000062c] 23826 1726867455.10598: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000062c 23826 1726867455.10679: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000062c 23826 1726867455.10682: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 23826 1726867455.10728: no more pending results, returning what we have 23826 1726867455.10731: results queue empty 23826 1726867455.10732: checking for any_errors_fatal 23826 1726867455.10739: done checking for any_errors_fatal 23826 1726867455.10740: checking for max_fail_percentage 23826 1726867455.10741: done checking for max_fail_percentage 23826 1726867455.10742: checking to see if all hosts have failed and the running result is not ok 23826 1726867455.10743: done checking to see if all hosts have failed 23826 1726867455.10744: getting the remaining hosts for this loop 23826 1726867455.10745: done getting the remaining hosts for this loop 23826 1726867455.10748: getting the next task for host managed_node2 23826 1726867455.10755: done getting next task for host managed_node2 23826 1726867455.10757: ^ task is: TASK: Get NM profile info 23826 1726867455.10760: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867455.10763: getting variables 23826 1726867455.10764: in VariableManager get_vars() 23826 1726867455.10792: Calling all_inventory to load vars for managed_node2 23826 1726867455.10795: Calling groups_inventory to load vars for managed_node2 23826 1726867455.10797: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867455.10807: Calling all_plugins_play to load vars for managed_node2 23826 1726867455.10809: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867455.10812: Calling groups_plugins_play to load vars for managed_node2 23826 1726867455.11662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867455.12531: done with get_vars() 23826 1726867455.12544: done getting variables 23826 1726867455.12585: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:24:15 -0400 (0:00:00.030) 0:00:37.137 ****** 23826 1726867455.12605: entering _queue_task() for managed_node2/shell 23826 1726867455.12809: worker is 1 (out of 1 available) 23826 1726867455.12824: exiting _queue_task() for managed_node2/shell 23826 1726867455.12834: done queuing things up, now waiting for results queue to drain 23826 1726867455.12835: waiting for pending results... 23826 1726867455.12996: running TaskExecutor() for managed_node2/TASK: Get NM profile info 23826 1726867455.13075: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000062d 23826 1726867455.13090: variable 'ansible_search_path' from source: unknown 23826 1726867455.13093: variable 'ansible_search_path' from source: unknown 23826 1726867455.13120: calling self._execute() 23826 1726867455.13191: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867455.13195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867455.13204: variable 'omit' from source: magic vars 23826 1726867455.13466: variable 'ansible_distribution_major_version' from source: facts 23826 1726867455.13475: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867455.13482: variable 'omit' from source: magic vars 23826 1726867455.13519: variable 'omit' from source: magic vars 23826 1726867455.13586: variable 'profile' from source: include params 23826 1726867455.13590: variable 'interface' from source: set_fact 23826 1726867455.13641: variable 'interface' from source: set_fact 23826 1726867455.13656: variable 'omit' from source: magic vars 23826 1726867455.13692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867455.13723: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867455.13737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867455.13750: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867455.13759: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867455.13783: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867455.13786: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867455.13789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867455.13859: Set connection var ansible_timeout to 10 23826 1726867455.13866: Set connection var ansible_shell_executable to /bin/sh 23826 1726867455.13869: Set connection var ansible_connection to ssh 23826 1726867455.13876: Set connection var ansible_pipelining to False 23826 1726867455.13880: Set connection var ansible_shell_type to sh 23826 1726867455.13885: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867455.13903: variable 'ansible_shell_executable' from source: unknown 23826 1726867455.13905: variable 'ansible_connection' from source: unknown 23826 1726867455.13910: variable 'ansible_module_compression' from source: unknown 23826 1726867455.13913: variable 'ansible_shell_type' from source: unknown 23826 1726867455.13915: variable 'ansible_shell_executable' from source: unknown 23826 1726867455.13918: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867455.13920: variable 'ansible_pipelining' from source: unknown 23826 1726867455.13923: variable 'ansible_timeout' from source: unknown 23826 1726867455.13925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867455.14021: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867455.14031: variable 'omit' from source: magic vars 23826 1726867455.14037: starting attempt loop 23826 1726867455.14041: running the handler 23826 1726867455.14051: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867455.14066: _low_level_execute_command(): starting 23826 1726867455.14072: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867455.14556: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867455.14588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867455.14592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867455.14595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867455.14597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867455.14653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867455.14656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867455.14658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867455.14707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867455.16396: stdout chunk (state=3): >>>/root <<< 23826 1726867455.16494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867455.16521: stderr chunk (state=3): >>><<< 23826 1726867455.16525: stdout chunk (state=3): >>><<< 23826 1726867455.16550: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867455.16560: _low_level_execute_command(): starting 23826 1726867455.16567: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867455.1654897-25624-180867214901315 `" && echo ansible-tmp-1726867455.1654897-25624-180867214901315="` echo /root/.ansible/tmp/ansible-tmp-1726867455.1654897-25624-180867214901315 `" ) && sleep 0' 23826 1726867455.16981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867455.17012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867455.17016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867455.17018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 23826 1726867455.17020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867455.17067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867455.17070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867455.17119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867455.19067: stdout chunk (state=3): >>>ansible-tmp-1726867455.1654897-25624-180867214901315=/root/.ansible/tmp/ansible-tmp-1726867455.1654897-25624-180867214901315 <<< 23826 1726867455.19176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867455.19202: stderr chunk (state=3): >>><<< 23826 1726867455.19205: stdout chunk (state=3): >>><<< 23826 1726867455.19221: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867455.1654897-25624-180867214901315=/root/.ansible/tmp/ansible-tmp-1726867455.1654897-25624-180867214901315 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867455.19248: variable 'ansible_module_compression' from source: unknown 23826 1726867455.19290: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 23826 1726867455.19320: variable 'ansible_facts' from source: unknown 23826 1726867455.19375: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867455.1654897-25624-180867214901315/AnsiballZ_command.py 23826 1726867455.19475: Sending initial data 23826 1726867455.19480: Sent initial data (156 bytes) 23826 1726867455.19905: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867455.19911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867455.19913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 23826 
1726867455.19915: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867455.19917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867455.19960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867455.19963: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867455.20009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867455.21594: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 23826 1726867455.21600: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867455.21632: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867455.21672: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpqisa7f3b /root/.ansible/tmp/ansible-tmp-1726867455.1654897-25624-180867214901315/AnsiballZ_command.py <<< 23826 1726867455.21676: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867455.1654897-25624-180867214901315/AnsiballZ_command.py" <<< 23826 1726867455.21708: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpqisa7f3b" to remote "/root/.ansible/tmp/ansible-tmp-1726867455.1654897-25624-180867214901315/AnsiballZ_command.py" <<< 23826 1726867455.21716: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867455.1654897-25624-180867214901315/AnsiballZ_command.py" <<< 23826 1726867455.22247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867455.22258: stderr chunk (state=3): >>><<< 23826 1726867455.22261: stdout chunk (state=3): >>><<< 23826 1726867455.22304: done transferring module to remote 23826 1726867455.22314: _low_level_execute_command(): starting 23826 1726867455.22318: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867455.1654897-25624-180867214901315/ /root/.ansible/tmp/ansible-tmp-1726867455.1654897-25624-180867214901315/AnsiballZ_command.py && sleep 0' 23826 1726867455.22763: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867455.22771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867455.22773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867455.22775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 23826 1726867455.22780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867455.22825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867455.22828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867455.22835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867455.22875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867455.24689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867455.24714: stderr chunk (state=3): >>><<< 23826 1726867455.24717: stdout chunk (state=3): >>><<< 23826 1726867455.24730: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867455.24733: _low_level_execute_command(): starting 23826 1726867455.24738: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867455.1654897-25624-180867214901315/AnsiballZ_command.py && sleep 0' 23826 1726867455.25171: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867455.25174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867455.25176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867455.25180: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867455.25182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867455.25241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867455.25245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867455.25248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867455.25287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867455.42529: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 17:24:15.407653", "end": "2024-09-20 17:24:15.423757", "delta": "0:00:00.016104", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 
23826 1726867455.44115: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.12.116 closed. <<< 23826 1726867455.44119: stdout chunk (state=3): >>><<< 23826 1726867455.44122: stderr chunk (state=3): >>><<< 23826 1726867455.44142: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 17:24:15.407653", "end": "2024-09-20 17:24:15.423757", "delta": "0:00:00.016104", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.12.116 closed. 
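The failed command above is the core of the "Get NM profile info" task: nmcli lists connection names with their backing files, and the two greps keep only rows for ethtest0 that live under /etc. Because no such profile exists, grep exits 1, the command module reports a non-zero return code, and the play continues only because the failure is ignored and the return code is tested later via nm_profile_exists.rc. A sketch of such a task, with the {{ profile }} templating of the literal ethtest0 being an assumption:

    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
      register: nm_profile_exists   # the variable whose rc is checked by the next task
      ignore_errors: true           # matches the "...ignoring" marker on the failed result below
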
23826 1726867455.44184: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867455.1654897-25624-180867214901315/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867455.44219: _low_level_execute_command(): starting 23826 1726867455.44223: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867455.1654897-25624-180867214901315/ > /dev/null 2>&1 && sleep 0' 23826 1726867455.44838: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867455.44853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867455.44867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867455.44886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867455.44902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867455.44944: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 23826 1726867455.44957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867455.45030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867455.45054: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867455.45079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867455.45150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867455.47052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867455.47055: stdout chunk (state=3): >>><<< 23826 1726867455.47057: stderr chunk (state=3): >>><<< 23826 1726867455.47073: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867455.47088: handler run complete 23826 1726867455.47283: Evaluated conditional (False): False 23826 1726867455.47287: attempt loop complete, returning result 23826 1726867455.47289: _execute() done 23826 1726867455.47291: dumping result to json 23826 1726867455.47293: done dumping result, returning 23826 1726867455.47295: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0affcac9-a3a5-a92d-a3ea-00000000062d] 23826 1726867455.47297: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000062d 23826 1726867455.47371: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000062d 23826 1726867455.47374: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.016104", "end": "2024-09-20 17:24:15.423757", "rc": 1, "start": "2024-09-20 17:24:15.407653" } MSG: non-zero return code ...ignoring 23826 1726867455.47454: no more pending results, returning what we have 23826 1726867455.47458: results queue empty 23826 1726867455.47458: checking for any_errors_fatal 23826 1726867455.47465: done checking for any_errors_fatal 23826 1726867455.47465: checking for max_fail_percentage 23826 1726867455.47467: done checking for max_fail_percentage 23826 1726867455.47468: checking to see if all hosts have failed and the running result is not ok 23826 1726867455.47469: done checking to see if all hosts have failed 23826 1726867455.47470: getting the remaining hosts for this loop 23826 1726867455.47471: done getting the remaining hosts for this loop 23826 1726867455.47474: getting the next task for host managed_node2 23826 1726867455.47483: done getting next task for host managed_node2 23826 1726867455.47489: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 23826 1726867455.47493: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867455.47497: getting variables 23826 1726867455.47498: in VariableManager get_vars() 23826 1726867455.47528: Calling all_inventory to load vars for managed_node2 23826 1726867455.47530: Calling groups_inventory to load vars for managed_node2 23826 1726867455.47534: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867455.47544: Calling all_plugins_play to load vars for managed_node2 23826 1726867455.47547: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867455.47549: Calling groups_plugins_play to load vars for managed_node2 23826 1726867455.49207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867455.51235: done with get_vars() 23826 1726867455.51258: done getting variables 23826 1726867455.51327: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:24:15 -0400 (0:00:00.387) 0:00:37.524 ****** 23826 1726867455.51361: entering _queue_task() for managed_node2/set_fact 23826 1726867455.51925: worker is 1 (out of 1 available) 23826 1726867455.51939: exiting _queue_task() for managed_node2/set_fact 23826 1726867455.51956: done queuing things up, now waiting for results queue to drain 23826 1726867455.51958: waiting for pending results... 
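As with the earlier flag task, the set_fact queued above is skipped in the records that follow, this time because nm_profile_exists.rc == 0 is false (the nmcli pipeline returned 1). A rough sketch with placeholder fact names, since the actual assignments never run in this log:

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      set_fact:
        nm_profile_found: true        # placeholder names; only the skip and its
        profile_ansible_managed: true # false condition are visible in the log
      when: nm_profile_exists.rc == 0
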
23826 1726867455.52194: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 23826 1726867455.52200: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000062e 23826 1726867455.52205: variable 'ansible_search_path' from source: unknown 23826 1726867455.52217: variable 'ansible_search_path' from source: unknown 23826 1726867455.52259: calling self._execute() 23826 1726867455.52361: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867455.52372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867455.52388: variable 'omit' from source: magic vars 23826 1726867455.52766: variable 'ansible_distribution_major_version' from source: facts 23826 1726867455.52784: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867455.52921: variable 'nm_profile_exists' from source: set_fact 23826 1726867455.52945: Evaluated conditional (nm_profile_exists.rc == 0): False 23826 1726867455.53051: when evaluation is False, skipping this task 23826 1726867455.53055: _execute() done 23826 1726867455.53057: dumping result to json 23826 1726867455.53059: done dumping result, returning 23826 1726867455.53062: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-a92d-a3ea-00000000062e] 23826 1726867455.53064: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000062e 23826 1726867455.53133: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000062e 23826 1726867455.53136: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 23826 1726867455.53201: no more pending results, returning what we have 23826 1726867455.53205: results queue empty 23826 1726867455.53206: checking for any_errors_fatal 23826 1726867455.53219: done checking for any_errors_fatal 23826 1726867455.53220: checking for max_fail_percentage 23826 1726867455.53222: done checking for max_fail_percentage 23826 1726867455.53223: checking to see if all hosts have failed and the running result is not ok 23826 1726867455.53224: done checking to see if all hosts have failed 23826 1726867455.53225: getting the remaining hosts for this loop 23826 1726867455.53226: done getting the remaining hosts for this loop 23826 1726867455.53229: getting the next task for host managed_node2 23826 1726867455.53238: done getting next task for host managed_node2 23826 1726867455.53241: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 23826 1726867455.53245: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 23826 1726867455.53248: getting variables 23826 1726867455.53250: in VariableManager get_vars() 23826 1726867455.53282: Calling all_inventory to load vars for managed_node2 23826 1726867455.53284: Calling groups_inventory to load vars for managed_node2 23826 1726867455.53288: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867455.53300: Calling all_plugins_play to load vars for managed_node2 23826 1726867455.53303: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867455.53306: Calling groups_plugins_play to load vars for managed_node2 23826 1726867455.54820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867455.57489: done with get_vars() 23826 1726867455.57519: done getting variables 23826 1726867455.57576: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 23826 1726867455.57698: variable 'profile' from source: include params 23826 1726867455.57703: variable 'interface' from source: set_fact 23826 1726867455.57768: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:24:15 -0400 (0:00:00.064) 0:00:37.589 ****** 23826 1726867455.57803: entering _queue_task() for managed_node2/command 23826 1726867455.58219: worker is 1 (out of 1 available) 23826 1726867455.58230: exiting _queue_task() for managed_node2/command 23826 1726867455.58240: done queuing things up, now waiting for results queue to drain 23826 1726867455.58241: waiting for pending results... 
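The set_fact task above is skipped because the registered nm_profile_exists.rc is 1 rather than 0, which is exactly the outcome the test expects when the profile should be absent. A hedged sketch of the skipped task; lsr_net_profile_exists appears later in this log, while the second fact name is an assumption:

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true  # assumed companion fact, not visible in this log
  when: nm_profile_exists.rc == 0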
23826 1726867455.58432: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-ethtest0 23826 1726867455.58783: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000630 23826 1726867455.58787: variable 'ansible_search_path' from source: unknown 23826 1726867455.58789: variable 'ansible_search_path' from source: unknown 23826 1726867455.58792: calling self._execute() 23826 1726867455.58795: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867455.58797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867455.58799: variable 'omit' from source: magic vars 23826 1726867455.59192: variable 'ansible_distribution_major_version' from source: facts 23826 1726867455.59211: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867455.59336: variable 'profile_stat' from source: set_fact 23826 1726867455.59359: Evaluated conditional (profile_stat.stat.exists): False 23826 1726867455.59366: when evaluation is False, skipping this task 23826 1726867455.59373: _execute() done 23826 1726867455.59383: dumping result to json 23826 1726867455.59390: done dumping result, returning 23826 1726867455.59400: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [0affcac9-a3a5-a92d-a3ea-000000000630] 23826 1726867455.59412: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000630 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 23826 1726867455.59561: no more pending results, returning what we have 23826 1726867455.59564: results queue empty 23826 1726867455.59566: checking for any_errors_fatal 23826 1726867455.59572: done checking for any_errors_fatal 23826 1726867455.59573: checking for max_fail_percentage 23826 1726867455.59575: done checking for max_fail_percentage 23826 1726867455.59576: checking to see if all hosts have failed and the running result is not ok 23826 1726867455.59580: done checking to see if all hosts have failed 23826 1726867455.59581: getting the remaining hosts for this loop 23826 1726867455.59583: done getting the remaining hosts for this loop 23826 1726867455.59587: getting the next task for host managed_node2 23826 1726867455.59595: done getting next task for host managed_node2 23826 1726867455.59597: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 23826 1726867455.59601: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867455.59606: getting variables 23826 1726867455.59610: in VariableManager get_vars() 23826 1726867455.59639: Calling all_inventory to load vars for managed_node2 23826 1726867455.59642: Calling groups_inventory to load vars for managed_node2 23826 1726867455.59645: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867455.59661: Calling all_plugins_play to load vars for managed_node2 23826 1726867455.59664: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867455.59668: Calling groups_plugins_play to load vars for managed_node2 23826 1726867455.60313: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000630 23826 1726867455.60316: WORKER PROCESS EXITING 23826 1726867455.61468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867455.64390: done with get_vars() 23826 1726867455.64423: done getting variables 23826 1726867455.64619: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 23826 1726867455.64733: variable 'profile' from source: include params 23826 1726867455.64738: variable 'interface' from source: set_fact 23826 1726867455.64796: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:24:15 -0400 (0:00:00.070) 0:00:37.659 ****** 23826 1726867455.64832: entering _queue_task() for managed_node2/set_fact 23826 1726867455.65156: worker is 1 (out of 1 available) 23826 1726867455.65168: exiting _queue_task() for managed_node2/set_fact 23826 1726867455.65283: done queuing things up, now waiting for results queue to drain 23826 1726867455.65284: waiting for pending results... 
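The ifcfg-ethtest0 inspection tasks that follow are all guarded by profile_stat.stat.exists, which is False here because no ifcfg file exists for the profile. A minimal sketch of how profile_stat is presumably registered earlier in get_profile_stat.yml, assuming the conventional ifcfg location; only the variable name and the guard expression come from the log:

- name: Get the profile file stat
  ansible.builtin.stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # assumed location
  register: profile_stat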
23826 1726867455.65460: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 23826 1726867455.65597: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000631 23826 1726867455.65624: variable 'ansible_search_path' from source: unknown 23826 1726867455.65631: variable 'ansible_search_path' from source: unknown 23826 1726867455.65670: calling self._execute() 23826 1726867455.65773: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867455.65787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867455.65804: variable 'omit' from source: magic vars 23826 1726867455.66179: variable 'ansible_distribution_major_version' from source: facts 23826 1726867455.66197: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867455.66336: variable 'profile_stat' from source: set_fact 23826 1726867455.66357: Evaluated conditional (profile_stat.stat.exists): False 23826 1726867455.66365: when evaluation is False, skipping this task 23826 1726867455.66378: _execute() done 23826 1726867455.66386: dumping result to json 23826 1726867455.66393: done dumping result, returning 23826 1726867455.66403: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [0affcac9-a3a5-a92d-a3ea-000000000631] 23826 1726867455.66584: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000631 23826 1726867455.66651: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000631 23826 1726867455.66655: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 23826 1726867455.66701: no more pending results, returning what we have 23826 1726867455.66705: results queue empty 23826 1726867455.66706: checking for any_errors_fatal 23826 1726867455.66716: done checking for any_errors_fatal 23826 1726867455.66717: checking for max_fail_percentage 23826 1726867455.66718: done checking for max_fail_percentage 23826 1726867455.66719: checking to see if all hosts have failed and the running result is not ok 23826 1726867455.66720: done checking to see if all hosts have failed 23826 1726867455.66721: getting the remaining hosts for this loop 23826 1726867455.66722: done getting the remaining hosts for this loop 23826 1726867455.66726: getting the next task for host managed_node2 23826 1726867455.66732: done getting next task for host managed_node2 23826 1726867455.66734: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 23826 1726867455.66738: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867455.66743: getting variables 23826 1726867455.66745: in VariableManager get_vars() 23826 1726867455.66774: Calling all_inventory to load vars for managed_node2 23826 1726867455.66779: Calling groups_inventory to load vars for managed_node2 23826 1726867455.66783: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867455.66795: Calling all_plugins_play to load vars for managed_node2 23826 1726867455.66798: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867455.66801: Calling groups_plugins_play to load vars for managed_node2 23826 1726867455.68270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867455.70828: done with get_vars() 23826 1726867455.70866: done getting variables 23826 1726867455.71238: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 23826 1726867455.71365: variable 'profile' from source: include params 23826 1726867455.71370: variable 'interface' from source: set_fact 23826 1726867455.71641: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:24:15 -0400 (0:00:00.068) 0:00:37.728 ****** 23826 1726867455.71679: entering _queue_task() for managed_node2/command 23826 1726867455.72353: worker is 1 (out of 1 available) 23826 1726867455.72365: exiting _queue_task() for managed_node2/command 23826 1726867455.72481: done queuing things up, now waiting for results queue to drain 23826 1726867455.72483: waiting for pending results... 
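Each skipped task reports the same structured result: changed stays false, false_condition records which when clause failed, and skip_reason states that the conditional evaluated to False. If one wanted to surface that guard explicitly while debugging a run like this, a small illustrative task (not part of the test files) could dump the registered value:

- name: Show why the ifcfg checks are skipped
  ansible.builtin.debug:
    var: profile_stat.stat.exists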
23826 1726867455.72871: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-ethtest0 23826 1726867455.73138: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000632 23826 1726867455.73245: variable 'ansible_search_path' from source: unknown 23826 1726867455.73249: variable 'ansible_search_path' from source: unknown 23826 1726867455.73256: calling self._execute() 23826 1726867455.73534: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867455.73683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867455.73687: variable 'omit' from source: magic vars 23826 1726867455.74565: variable 'ansible_distribution_major_version' from source: facts 23826 1726867455.74569: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867455.74712: variable 'profile_stat' from source: set_fact 23826 1726867455.74733: Evaluated conditional (profile_stat.stat.exists): False 23826 1726867455.74790: when evaluation is False, skipping this task 23826 1726867455.74798: _execute() done 23826 1726867455.74806: dumping result to json 23826 1726867455.74814: done dumping result, returning 23826 1726867455.74828: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-ethtest0 [0affcac9-a3a5-a92d-a3ea-000000000632] 23826 1726867455.74833: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000632 23826 1726867455.74983: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000632 23826 1726867455.74987: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 23826 1726867455.75050: no more pending results, returning what we have 23826 1726867455.75054: results queue empty 23826 1726867455.75056: checking for any_errors_fatal 23826 1726867455.75062: done checking for any_errors_fatal 23826 1726867455.75063: checking for max_fail_percentage 23826 1726867455.75065: done checking for max_fail_percentage 23826 1726867455.75066: checking to see if all hosts have failed and the running result is not ok 23826 1726867455.75067: done checking to see if all hosts have failed 23826 1726867455.75067: getting the remaining hosts for this loop 23826 1726867455.75069: done getting the remaining hosts for this loop 23826 1726867455.75073: getting the next task for host managed_node2 23826 1726867455.75084: done getting next task for host managed_node2 23826 1726867455.75087: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 23826 1726867455.75091: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867455.75095: getting variables 23826 1726867455.75097: in VariableManager get_vars() 23826 1726867455.75250: Calling all_inventory to load vars for managed_node2 23826 1726867455.75254: Calling groups_inventory to load vars for managed_node2 23826 1726867455.75258: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867455.75270: Calling all_plugins_play to load vars for managed_node2 23826 1726867455.75272: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867455.75274: Calling groups_plugins_play to load vars for managed_node2 23826 1726867455.77648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867455.79450: done with get_vars() 23826 1726867455.79482: done getting variables 23826 1726867455.79544: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 23826 1726867455.79663: variable 'profile' from source: include params 23826 1726867455.79667: variable 'interface' from source: set_fact 23826 1726867455.79735: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:24:15 -0400 (0:00:00.080) 0:00:37.809 ****** 23826 1726867455.79767: entering _queue_task() for managed_node2/set_fact 23826 1726867455.80474: worker is 1 (out of 1 available) 23826 1726867455.80487: exiting _queue_task() for managed_node2/set_fact 23826 1726867455.80498: done queuing things up, now waiting for results queue to drain 23826 1726867455.80499: waiting for pending results... 
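The fingerprint checks follow the same guarded pattern as the ansible_managed checks: a command task inspects the ifcfg file and a set_fact task records the result, and both are skipped when the file does not exist. A hedged sketch of that pattern; the command text, register, and fact names are illustrative, only the task names and the profile_stat guard are taken from the log:

- name: Get the fingerprint comment in ifcfg-{{ profile }}
  ansible.builtin.command: "grep fingerprint /etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # assumed command text
  register: ifcfg_fingerprint  # hypothetical register name
  when: profile_stat.stat.exists

- name: Verify the fingerprint comment in ifcfg-{{ profile }}
  ansible.builtin.set_fact:
    lsr_net_profile_fingerprint: true  # hypothetical fact name
  when:
    - profile_stat.stat.exists
    - ifcfg_fingerprint.rc == 0  # assumed check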
23826 1726867455.80740: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-ethtest0 23826 1726867455.80984: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000633 23826 1726867455.80993: variable 'ansible_search_path' from source: unknown 23826 1726867455.80997: variable 'ansible_search_path' from source: unknown 23826 1726867455.81000: calling self._execute() 23826 1726867455.81065: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867455.81079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867455.81100: variable 'omit' from source: magic vars 23826 1726867455.81583: variable 'ansible_distribution_major_version' from source: facts 23826 1726867455.81587: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867455.81774: variable 'profile_stat' from source: set_fact 23826 1726867455.81806: Evaluated conditional (profile_stat.stat.exists): False 23826 1726867455.81862: when evaluation is False, skipping this task 23826 1726867455.81865: _execute() done 23826 1726867455.81871: dumping result to json 23826 1726867455.81874: done dumping result, returning 23826 1726867455.81878: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [0affcac9-a3a5-a92d-a3ea-000000000633] 23826 1726867455.81881: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000633 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 23826 1726867455.82029: no more pending results, returning what we have 23826 1726867455.82033: results queue empty 23826 1726867455.82034: checking for any_errors_fatal 23826 1726867455.82041: done checking for any_errors_fatal 23826 1726867455.82042: checking for max_fail_percentage 23826 1726867455.82044: done checking for max_fail_percentage 23826 1726867455.82045: checking to see if all hosts have failed and the running result is not ok 23826 1726867455.82046: done checking to see if all hosts have failed 23826 1726867455.82046: getting the remaining hosts for this loop 23826 1726867455.82048: done getting the remaining hosts for this loop 23826 1726867455.82052: getting the next task for host managed_node2 23826 1726867455.82061: done getting next task for host managed_node2 23826 1726867455.82064: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 23826 1726867455.82067: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867455.82073: getting variables 23826 1726867455.82075: in VariableManager get_vars() 23826 1726867455.82105: Calling all_inventory to load vars for managed_node2 23826 1726867455.82110: Calling groups_inventory to load vars for managed_node2 23826 1726867455.82114: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867455.82126: Calling all_plugins_play to load vars for managed_node2 23826 1726867455.82129: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867455.82132: Calling groups_plugins_play to load vars for managed_node2 23826 1726867455.82792: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000633 23826 1726867455.82796: WORKER PROCESS EXITING 23826 1726867455.84130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867455.86684: done with get_vars() 23826 1726867455.86701: done getting variables 23826 1726867455.86747: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 23826 1726867455.86832: variable 'profile' from source: include params 23826 1726867455.86836: variable 'interface' from source: set_fact 23826 1726867455.86878: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 17:24:15 -0400 (0:00:00.071) 0:00:37.880 ****** 23826 1726867455.86903: entering _queue_task() for managed_node2/assert 23826 1726867455.87139: worker is 1 (out of 1 available) 23826 1726867455.87152: exiting _queue_task() for managed_node2/assert 23826 1726867455.87163: done queuing things up, now waiting for results queue to drain 23826 1726867455.87165: waiting for pending results... 
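The next task is the actual assertion from assert_profile_absent.yml; further down the log it evaluates "not lsr_net_profile_exists" to True and reports that all assertions passed. A minimal reconstruction consistent with that behaviour, assuming nothing beyond the condition and the templated task name shown in the log:

- name: Assert that the profile is absent - '{{ profile }}'
  ansible.builtin.assert:
    that:
      - not lsr_net_profile_exists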
23826 1726867455.87345: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'ethtest0' 23826 1726867455.87419: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000613 23826 1726867455.87431: variable 'ansible_search_path' from source: unknown 23826 1726867455.87434: variable 'ansible_search_path' from source: unknown 23826 1726867455.87461: calling self._execute() 23826 1726867455.87538: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867455.87542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867455.87553: variable 'omit' from source: magic vars 23826 1726867455.87854: variable 'ansible_distribution_major_version' from source: facts 23826 1726867455.87863: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867455.87874: variable 'omit' from source: magic vars 23826 1726867455.87993: variable 'omit' from source: magic vars 23826 1726867455.88136: variable 'profile' from source: include params 23826 1726867455.88139: variable 'interface' from source: set_fact 23826 1726867455.88382: variable 'interface' from source: set_fact 23826 1726867455.88419: variable 'omit' from source: magic vars 23826 1726867455.88458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867455.88536: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867455.88554: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867455.88571: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867455.88584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867455.88616: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867455.88619: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867455.88808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867455.88925: Set connection var ansible_timeout to 10 23826 1726867455.88936: Set connection var ansible_shell_executable to /bin/sh 23826 1726867455.88939: Set connection var ansible_connection to ssh 23826 1726867455.88948: Set connection var ansible_pipelining to False 23826 1726867455.88950: Set connection var ansible_shell_type to sh 23826 1726867455.88956: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867455.88981: variable 'ansible_shell_executable' from source: unknown 23826 1726867455.89003: variable 'ansible_connection' from source: unknown 23826 1726867455.89006: variable 'ansible_module_compression' from source: unknown 23826 1726867455.89013: variable 'ansible_shell_type' from source: unknown 23826 1726867455.89046: variable 'ansible_shell_executable' from source: unknown 23826 1726867455.89050: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867455.89054: variable 'ansible_pipelining' from source: unknown 23826 1726867455.89057: variable 'ansible_timeout' from source: unknown 23826 1726867455.89067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867455.89225: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867455.89283: variable 'omit' from source: magic vars 23826 1726867455.89286: starting attempt loop 23826 1726867455.89289: running the handler 23826 1726867455.89386: variable 'lsr_net_profile_exists' from source: set_fact 23826 1726867455.89392: Evaluated conditional (not lsr_net_profile_exists): True 23826 1726867455.89397: handler run complete 23826 1726867455.89418: attempt loop complete, returning result 23826 1726867455.89422: _execute() done 23826 1726867455.89424: dumping result to json 23826 1726867455.89427: done dumping result, returning 23826 1726867455.89583: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'ethtest0' [0affcac9-a3a5-a92d-a3ea-000000000613] 23826 1726867455.89586: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000613 23826 1726867455.89643: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000613 23826 1726867455.89646: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 23826 1726867455.89716: no more pending results, returning what we have 23826 1726867455.89719: results queue empty 23826 1726867455.89720: checking for any_errors_fatal 23826 1726867455.89729: done checking for any_errors_fatal 23826 1726867455.89730: checking for max_fail_percentage 23826 1726867455.89732: done checking for max_fail_percentage 23826 1726867455.89733: checking to see if all hosts have failed and the running result is not ok 23826 1726867455.89734: done checking to see if all hosts have failed 23826 1726867455.89735: getting the remaining hosts for this loop 23826 1726867455.89736: done getting the remaining hosts for this loop 23826 1726867455.89739: getting the next task for host managed_node2 23826 1726867455.89749: done getting next task for host managed_node2 23826 1726867455.89752: ^ task is: TASK: Include the task 'assert_device_absent.yml' 23826 1726867455.89754: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867455.89757: getting variables 23826 1726867455.89759: in VariableManager get_vars() 23826 1726867455.89788: Calling all_inventory to load vars for managed_node2 23826 1726867455.89790: Calling groups_inventory to load vars for managed_node2 23826 1726867455.89794: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867455.89804: Calling all_plugins_play to load vars for managed_node2 23826 1726867455.89806: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867455.89814: Calling groups_plugins_play to load vars for managed_node2 23826 1726867455.91233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867455.92903: done with get_vars() 23826 1726867455.92925: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:89 Friday 20 September 2024 17:24:15 -0400 (0:00:00.061) 0:00:37.941 ****** 23826 1726867455.93023: entering _queue_task() for managed_node2/include_tasks 23826 1726867455.93595: worker is 1 (out of 1 available) 23826 1726867455.93608: exiting _queue_task() for managed_node2/include_tasks 23826 1726867455.93620: done queuing things up, now waiting for results queue to drain 23826 1726867455.93621: waiting for pending results... 23826 1726867455.94282: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_absent.yml' 23826 1726867455.94522: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000009e 23826 1726867455.94525: variable 'ansible_search_path' from source: unknown 23826 1726867455.94528: calling self._execute() 23826 1726867455.94552: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867455.94562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867455.94575: variable 'omit' from source: magic vars 23826 1726867455.94973: variable 'ansible_distribution_major_version' from source: facts 23826 1726867455.94996: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867455.95010: _execute() done 23826 1726867455.95020: dumping result to json 23826 1726867455.95030: done dumping result, returning 23826 1726867455.95043: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_absent.yml' [0affcac9-a3a5-a92d-a3ea-00000000009e] 23826 1726867455.95053: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000009e 23826 1726867455.95199: no more pending results, returning what we have 23826 1726867455.95205: in VariableManager get_vars() 23826 1726867455.95246: Calling all_inventory to load vars for managed_node2 23826 1726867455.95249: Calling groups_inventory to load vars for managed_node2 23826 1726867455.95254: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867455.95268: Calling all_plugins_play to load vars for managed_node2 23826 1726867455.95271: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867455.95274: Calling groups_plugins_play to load vars for managed_node2 23826 1726867455.96102: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000009e 23826 1726867455.96106: WORKER PROCESS EXITING 23826 1726867455.97940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867456.01283: done with get_vars() 23826 
1726867456.01306: variable 'ansible_search_path' from source: unknown 23826 1726867456.01322: we have included files to process 23826 1726867456.01323: generating all_blocks data 23826 1726867456.01325: done generating all_blocks data 23826 1726867456.01330: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 23826 1726867456.01331: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 23826 1726867456.01334: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 23826 1726867456.01675: in VariableManager get_vars() 23826 1726867456.01698: done with get_vars() 23826 1726867456.01817: done processing included file 23826 1726867456.01820: iterating over new_blocks loaded from include file 23826 1726867456.01822: in VariableManager get_vars() 23826 1726867456.01832: done with get_vars() 23826 1726867456.01834: filtering new block on tags 23826 1726867456.01851: done filtering new block on tags 23826 1726867456.01853: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 23826 1726867456.01857: extending task lists for all hosts with included blocks 23826 1726867456.02115: done extending task lists 23826 1726867456.02116: done processing included files 23826 1726867456.02117: results queue empty 23826 1726867456.02118: checking for any_errors_fatal 23826 1726867456.02121: done checking for any_errors_fatal 23826 1726867456.02122: checking for max_fail_percentage 23826 1726867456.02123: done checking for max_fail_percentage 23826 1726867456.02123: checking to see if all hosts have failed and the running result is not ok 23826 1726867456.02124: done checking to see if all hosts have failed 23826 1726867456.02125: getting the remaining hosts for this loop 23826 1726867456.02127: done getting the remaining hosts for this loop 23826 1726867456.02129: getting the next task for host managed_node2 23826 1726867456.02133: done getting next task for host managed_node2 23826 1726867456.02135: ^ task is: TASK: Include the task 'get_interface_stat.yml' 23826 1726867456.02138: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867456.02140: getting variables 23826 1726867456.02141: in VariableManager get_vars() 23826 1726867456.02150: Calling all_inventory to load vars for managed_node2 23826 1726867456.02153: Calling groups_inventory to load vars for managed_node2 23826 1726867456.02155: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867456.02161: Calling all_plugins_play to load vars for managed_node2 23826 1726867456.02163: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867456.02166: Calling groups_plugins_play to load vars for managed_node2 23826 1726867456.03362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867456.04983: done with get_vars() 23826 1726867456.05004: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 17:24:16 -0400 (0:00:00.120) 0:00:38.062 ****** 23826 1726867456.05086: entering _queue_task() for managed_node2/include_tasks 23826 1726867456.05438: worker is 1 (out of 1 available) 23826 1726867456.05450: exiting _queue_task() for managed_node2/include_tasks 23826 1726867456.05461: done queuing things up, now waiting for results queue to drain 23826 1726867456.05467: waiting for pending results... 23826 1726867456.05751: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 23826 1726867456.05857: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000664 23826 1726867456.05870: variable 'ansible_search_path' from source: unknown 23826 1726867456.05874: variable 'ansible_search_path' from source: unknown 23826 1726867456.05921: calling self._execute() 23826 1726867456.06022: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867456.06028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867456.06042: variable 'omit' from source: magic vars 23826 1726867456.06424: variable 'ansible_distribution_major_version' from source: facts 23826 1726867456.06436: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867456.06447: _execute() done 23826 1726867456.06450: dumping result to json 23826 1726867456.06454: done dumping result, returning 23826 1726867456.06482: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-a92d-a3ea-000000000664] 23826 1726867456.06485: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000664 23826 1726867456.06575: no more pending results, returning what we have 23826 1726867456.06581: in VariableManager get_vars() 23826 1726867456.06616: Calling all_inventory to load vars for managed_node2 23826 1726867456.06618: Calling groups_inventory to load vars for managed_node2 23826 1726867456.06621: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867456.06635: Calling all_plugins_play to load vars for managed_node2 23826 1726867456.06637: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867456.06639: Calling groups_plugins_play to load vars for managed_node2 23826 1726867456.07292: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000664 23826 1726867456.07296: WORKER PROCESS EXITING 23826 1726867456.08091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 23826 1726867456.09711: done with get_vars() 23826 1726867456.09730: variable 'ansible_search_path' from source: unknown 23826 1726867456.09731: variable 'ansible_search_path' from source: unknown 23826 1726867456.09765: we have included files to process 23826 1726867456.09767: generating all_blocks data 23826 1726867456.09768: done generating all_blocks data 23826 1726867456.09769: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 23826 1726867456.09770: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 23826 1726867456.09772: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 23826 1726867456.09956: done processing included file 23826 1726867456.09958: iterating over new_blocks loaded from include file 23826 1726867456.09960: in VariableManager get_vars() 23826 1726867456.09972: done with get_vars() 23826 1726867456.09973: filtering new block on tags 23826 1726867456.09990: done filtering new block on tags 23826 1726867456.09992: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 23826 1726867456.09997: extending task lists for all hosts with included blocks 23826 1726867456.10100: done extending task lists 23826 1726867456.10101: done processing included files 23826 1726867456.10102: results queue empty 23826 1726867456.10103: checking for any_errors_fatal 23826 1726867456.10105: done checking for any_errors_fatal 23826 1726867456.10106: checking for max_fail_percentage 23826 1726867456.10107: done checking for max_fail_percentage 23826 1726867456.10108: checking to see if all hosts have failed and the running result is not ok 23826 1726867456.10108: done checking to see if all hosts have failed 23826 1726867456.10109: getting the remaining hosts for this loop 23826 1726867456.10110: done getting the remaining hosts for this loop 23826 1726867456.10112: getting the next task for host managed_node2 23826 1726867456.10116: done getting next task for host managed_node2 23826 1726867456.10118: ^ task is: TASK: Get stat for interface {{ interface }} 23826 1726867456.10121: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867456.10124: getting variables 23826 1726867456.10125: in VariableManager get_vars() 23826 1726867456.10138: Calling all_inventory to load vars for managed_node2 23826 1726867456.10140: Calling groups_inventory to load vars for managed_node2 23826 1726867456.10142: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867456.10147: Calling all_plugins_play to load vars for managed_node2 23826 1726867456.10150: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867456.10152: Calling groups_plugins_play to load vars for managed_node2 23826 1726867456.11338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867456.13004: done with get_vars() 23826 1726867456.13023: done getting variables 23826 1726867456.13181: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:24:16 -0400 (0:00:00.081) 0:00:38.143 ****** 23826 1726867456.13210: entering _queue_task() for managed_node2/stat 23826 1726867456.13542: worker is 1 (out of 1 available) 23826 1726867456.13553: exiting _queue_task() for managed_node2/stat 23826 1726867456.13564: done queuing things up, now waiting for results queue to drain 23826 1726867456.13565: waiting for pending results... 23826 1726867456.13906: running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 23826 1726867456.13982: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000687 23826 1726867456.14010: variable 'ansible_search_path' from source: unknown 23826 1726867456.14018: variable 'ansible_search_path' from source: unknown 23826 1726867456.14058: calling self._execute() 23826 1726867456.14159: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867456.14170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867456.14187: variable 'omit' from source: magic vars 23826 1726867456.14723: variable 'ansible_distribution_major_version' from source: facts 23826 1726867456.14726: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867456.14729: variable 'omit' from source: magic vars 23826 1726867456.14732: variable 'omit' from source: magic vars 23826 1726867456.14734: variable 'interface' from source: set_fact 23826 1726867456.14736: variable 'omit' from source: magic vars 23826 1726867456.14787: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867456.14827: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867456.14858: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867456.14884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867456.14899: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867456.14931: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867456.14940: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867456.14947: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 23826 1726867456.15051: Set connection var ansible_timeout to 10 23826 1726867456.15097: Set connection var ansible_shell_executable to /bin/sh 23826 1726867456.15104: Set connection var ansible_connection to ssh 23826 1726867456.15125: Set connection var ansible_pipelining to False 23826 1726867456.15132: Set connection var ansible_shell_type to sh 23826 1726867456.15141: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867456.15165: variable 'ansible_shell_executable' from source: unknown 23826 1726867456.15179: variable 'ansible_connection' from source: unknown 23826 1726867456.15186: variable 'ansible_module_compression' from source: unknown 23826 1726867456.15227: variable 'ansible_shell_type' from source: unknown 23826 1726867456.15231: variable 'ansible_shell_executable' from source: unknown 23826 1726867456.15233: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867456.15235: variable 'ansible_pipelining' from source: unknown 23826 1726867456.15237: variable 'ansible_timeout' from source: unknown 23826 1726867456.15239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867456.15446: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 23826 1726867456.15463: variable 'omit' from source: magic vars 23826 1726867456.15503: starting attempt loop 23826 1726867456.15506: running the handler 23826 1726867456.15509: _low_level_execute_command(): starting 23826 1726867456.15518: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867456.16266: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867456.16269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867456.16272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867456.16274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867456.16279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867456.16316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867456.16331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867456.16396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867456.18088: stdout chunk (state=3): >>>/root <<< 23826 1726867456.18246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867456.18249: stdout chunk 
(state=3): >>><<< 23826 1726867456.18251: stderr chunk (state=3): >>><<< 23826 1726867456.18269: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867456.18367: _low_level_execute_command(): starting 23826 1726867456.18372: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867456.1827438-25662-196808168293393 `" && echo ansible-tmp-1726867456.1827438-25662-196808168293393="` echo /root/.ansible/tmp/ansible-tmp-1726867456.1827438-25662-196808168293393 `" ) && sleep 0' 23826 1726867456.18873: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867456.18886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867456.18922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867456.18925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867456.18935: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867456.19018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867456.19022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867456.19073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867456.21004: stdout chunk (state=3): >>>ansible-tmp-1726867456.1827438-25662-196808168293393=/root/.ansible/tmp/ansible-tmp-1726867456.1827438-25662-196808168293393 <<< 23826 1726867456.21110: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867456.21141: stderr chunk (state=3): >>><<< 23826 1726867456.21143: stdout chunk (state=3): >>><<< 23826 1726867456.21187: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867456.1827438-25662-196808168293393=/root/.ansible/tmp/ansible-tmp-1726867456.1827438-25662-196808168293393 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867456.21201: variable 'ansible_module_compression' from source: unknown 23826 1726867456.21250: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 23826 1726867456.21285: variable 'ansible_facts' from source: unknown 23826 1726867456.21350: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867456.1827438-25662-196808168293393/AnsiballZ_stat.py 23826 1726867456.21558: Sending initial data 23826 1726867456.21561: Sent initial data (153 bytes) 23826 1726867456.22102: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867456.22126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867456.22204: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867456.22208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867456.22288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867456.22295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867456.22335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 23826 1726867456.22394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867456.23951: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867456.23989: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 23826 1726867456.24030: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmp35u3c8w1 /root/.ansible/tmp/ansible-tmp-1726867456.1827438-25662-196808168293393/AnsiballZ_stat.py <<< 23826 1726867456.24034: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867456.1827438-25662-196808168293393/AnsiballZ_stat.py" <<< 23826 1726867456.24071: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmp35u3c8w1" to remote "/root/.ansible/tmp/ansible-tmp-1726867456.1827438-25662-196808168293393/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867456.1827438-25662-196808168293393/AnsiballZ_stat.py" <<< 23826 1726867456.24592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867456.24631: stderr chunk (state=3): >>><<< 23826 1726867456.24634: stdout chunk (state=3): >>><<< 23826 1726867456.24656: done transferring module to remote 23826 1726867456.24665: _low_level_execute_command(): starting 23826 1726867456.24669: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867456.1827438-25662-196808168293393/ /root/.ansible/tmp/ansible-tmp-1726867456.1827438-25662-196808168293393/AnsiballZ_stat.py && sleep 0' 23826 1726867456.25069: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867456.25095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867456.25098: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867456.25100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867456.25151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867456.25154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867456.25202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867456.26967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867456.26991: stderr chunk (state=3): >>><<< 23826 1726867456.26994: stdout chunk (state=3): >>><<< 23826 1726867456.27007: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867456.27012: _low_level_execute_command(): starting 23826 1726867456.27017: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867456.1827438-25662-196808168293393/AnsiballZ_stat.py && sleep 0' 23826 1726867456.27418: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867456.27422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 23826 1726867456.27424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 23826 1726867456.27426: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867456.27428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867456.27470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867456.27473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867456.27523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867456.42876: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 23826 1726867456.44484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 23826 1726867456.44489: stdout chunk (state=3): >>><<< 23826 1726867456.44491: stderr chunk (state=3): >>><<< 23826 1726867456.44493: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
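The stat run above returned {"changed": false, "stat": {"exists": false}} for /sys/class/net/ethtest0, i.e. the test interface is no longer present in the kernel. Reconstructed from the logged module_args and from the task and register names that appear just below ("Get stat for interface ethtest0", interface_stat), the task in assert_device_absent.yml is presumably close to the following sketch; it is inferred from the log, not copied from the playbook file:

- name: Get stat for interface ethtest0
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"   # interface resolves to 'ethtest0' via set_fact in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat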
23826 1726867456.44496: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867456.1827438-25662-196808168293393/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867456.44498: _low_level_execute_command(): starting 23826 1726867456.44499: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867456.1827438-25662-196808168293393/ > /dev/null 2>&1 && sleep 0' 23826 1726867456.45144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867456.45180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867456.45295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867456.45320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867456.45339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867456.45427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867456.47350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867456.47361: stdout chunk (state=3): >>><<< 23826 1726867456.47373: stderr chunk (state=3): >>><<< 23826 1726867456.47397: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867456.47583: handler run complete 23826 1726867456.47587: attempt loop complete, returning result 23826 1726867456.47589: _execute() done 23826 1726867456.47591: dumping result to json 23826 1726867456.47594: done dumping result, returning 23826 1726867456.47597: done running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 [0affcac9-a3a5-a92d-a3ea-000000000687] 23826 1726867456.47600: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000687 23826 1726867456.47682: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000687 23826 1726867456.47686: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 23826 1726867456.47748: no more pending results, returning what we have 23826 1726867456.47751: results queue empty 23826 1726867456.47752: checking for any_errors_fatal 23826 1726867456.47754: done checking for any_errors_fatal 23826 1726867456.47755: checking for max_fail_percentage 23826 1726867456.47756: done checking for max_fail_percentage 23826 1726867456.47757: checking to see if all hosts have failed and the running result is not ok 23826 1726867456.47758: done checking to see if all hosts have failed 23826 1726867456.47759: getting the remaining hosts for this loop 23826 1726867456.47761: done getting the remaining hosts for this loop 23826 1726867456.47765: getting the next task for host managed_node2 23826 1726867456.47773: done getting next task for host managed_node2 23826 1726867456.47775: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 23826 1726867456.47780: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867456.47785: getting variables 23826 1726867456.47786: in VariableManager get_vars() 23826 1726867456.47892: Calling all_inventory to load vars for managed_node2 23826 1726867456.47895: Calling groups_inventory to load vars for managed_node2 23826 1726867456.47898: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867456.47910: Calling all_plugins_play to load vars for managed_node2 23826 1726867456.47971: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867456.47979: Calling groups_plugins_play to load vars for managed_node2 23826 1726867456.49405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867456.51043: done with get_vars() 23826 1726867456.51066: done getting variables 23826 1726867456.51129: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 23826 1726867456.51249: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 17:24:16 -0400 (0:00:00.380) 0:00:38.524 ****** 23826 1726867456.51283: entering _queue_task() for managed_node2/assert 23826 1726867456.51624: worker is 1 (out of 1 available) 23826 1726867456.51637: exiting _queue_task() for managed_node2/assert 23826 1726867456.51647: done queuing things up, now waiting for results queue to drain 23826 1726867456.51648: waiting for pending results... 
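The task being queued here, at assert_device_absent.yml:5, is the assertion that consumes interface_stat; the log below shows it evaluating the conditional (not interface_stat.stat.exists) and reporting "All assertions passed". A plausible reconstruction, inferred from that logged conditional rather than taken from the playbook file:

- name: Assert that the interface is absent - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - not interface_stat.stat.exists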
23826 1726867456.51933: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'ethtest0' 23826 1726867456.52047: in run() - task 0affcac9-a3a5-a92d-a3ea-000000000665 23826 1726867456.52070: variable 'ansible_search_path' from source: unknown 23826 1726867456.52080: variable 'ansible_search_path' from source: unknown 23826 1726867456.52126: calling self._execute() 23826 1726867456.52230: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867456.52240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867456.52253: variable 'omit' from source: magic vars 23826 1726867456.52628: variable 'ansible_distribution_major_version' from source: facts 23826 1726867456.52749: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867456.52752: variable 'omit' from source: magic vars 23826 1726867456.52754: variable 'omit' from source: magic vars 23826 1726867456.52797: variable 'interface' from source: set_fact 23826 1726867456.52823: variable 'omit' from source: magic vars 23826 1726867456.52873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867456.52921: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867456.52948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867456.52970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867456.52991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867456.53027: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867456.53037: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867456.53084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867456.53157: Set connection var ansible_timeout to 10 23826 1726867456.53170: Set connection var ansible_shell_executable to /bin/sh 23826 1726867456.53176: Set connection var ansible_connection to ssh 23826 1726867456.53195: Set connection var ansible_pipelining to False 23826 1726867456.53201: Set connection var ansible_shell_type to sh 23826 1726867456.53212: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867456.53240: variable 'ansible_shell_executable' from source: unknown 23826 1726867456.53300: variable 'ansible_connection' from source: unknown 23826 1726867456.53304: variable 'ansible_module_compression' from source: unknown 23826 1726867456.53306: variable 'ansible_shell_type' from source: unknown 23826 1726867456.53310: variable 'ansible_shell_executable' from source: unknown 23826 1726867456.53313: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867456.53315: variable 'ansible_pipelining' from source: unknown 23826 1726867456.53317: variable 'ansible_timeout' from source: unknown 23826 1726867456.53319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867456.53437: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 23826 1726867456.53453: variable 'omit' from source: magic vars 23826 1726867456.53464: starting attempt loop 23826 1726867456.53471: running the handler 23826 1726867456.53629: variable 'interface_stat' from source: set_fact 23826 1726867456.53644: Evaluated conditional (not interface_stat.stat.exists): True 23826 1726867456.53737: handler run complete 23826 1726867456.53740: attempt loop complete, returning result 23826 1726867456.53742: _execute() done 23826 1726867456.53744: dumping result to json 23826 1726867456.53746: done dumping result, returning 23826 1726867456.53748: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'ethtest0' [0affcac9-a3a5-a92d-a3ea-000000000665] 23826 1726867456.53750: sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000665 23826 1726867456.53816: done sending task result for task 0affcac9-a3a5-a92d-a3ea-000000000665 23826 1726867456.53820: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 23826 1726867456.53888: no more pending results, returning what we have 23826 1726867456.53892: results queue empty 23826 1726867456.53893: checking for any_errors_fatal 23826 1726867456.53903: done checking for any_errors_fatal 23826 1726867456.53904: checking for max_fail_percentage 23826 1726867456.53906: done checking for max_fail_percentage 23826 1726867456.53909: checking to see if all hosts have failed and the running result is not ok 23826 1726867456.53911: done checking to see if all hosts have failed 23826 1726867456.53912: getting the remaining hosts for this loop 23826 1726867456.53913: done getting the remaining hosts for this loop 23826 1726867456.53917: getting the next task for host managed_node2 23826 1726867456.53925: done getting next task for host managed_node2 23826 1726867456.53927: ^ task is: TASK: Verify network state restored to default 23826 1726867456.53929: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867456.53934: getting variables 23826 1726867456.53935: in VariableManager get_vars() 23826 1726867456.53966: Calling all_inventory to load vars for managed_node2 23826 1726867456.53969: Calling groups_inventory to load vars for managed_node2 23826 1726867456.53973: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867456.53989: Calling all_plugins_play to load vars for managed_node2 23826 1726867456.53994: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867456.53997: Calling groups_plugins_play to load vars for managed_node2 23826 1726867456.55728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867456.57363: done with get_vars() 23826 1726867456.57387: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:91 Friday 20 September 2024 17:24:16 -0400 (0:00:00.061) 0:00:38.586 ****** 23826 1726867456.57482: entering _queue_task() for managed_node2/include_tasks 23826 1726867456.57829: worker is 1 (out of 1 available) 23826 1726867456.57843: exiting _queue_task() for managed_node2/include_tasks 23826 1726867456.57855: done queuing things up, now waiting for results queue to drain 23826 1726867456.57857: waiting for pending results... 23826 1726867456.58296: running TaskExecutor() for managed_node2/TASK: Verify network state restored to default 23826 1726867456.58301: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000009f 23826 1726867456.58304: variable 'ansible_search_path' from source: unknown 23826 1726867456.58306: calling self._execute() 23826 1726867456.58396: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867456.58412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867456.58431: variable 'omit' from source: magic vars 23826 1726867456.58799: variable 'ansible_distribution_major_version' from source: facts 23826 1726867456.58821: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867456.58832: _execute() done 23826 1726867456.58840: dumping result to json 23826 1726867456.58847: done dumping result, returning 23826 1726867456.58862: done running TaskExecutor() for managed_node2/TASK: Verify network state restored to default [0affcac9-a3a5-a92d-a3ea-00000000009f] 23826 1726867456.58870: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000009f 23826 1726867456.59121: no more pending results, returning what we have 23826 1726867456.59127: in VariableManager get_vars() 23826 1726867456.59163: Calling all_inventory to load vars for managed_node2 23826 1726867456.59166: Calling groups_inventory to load vars for managed_node2 23826 1726867456.59169: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867456.59186: Calling all_plugins_play to load vars for managed_node2 23826 1726867456.59189: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867456.59193: Calling groups_plugins_play to load vars for managed_node2 23826 1726867456.59792: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000009f 23826 1726867456.59796: WORKER PROCESS EXITING 23826 1726867456.60795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867456.62424: done with get_vars() 23826 
1726867456.62443: variable 'ansible_search_path' from source: unknown 23826 1726867456.62459: we have included files to process 23826 1726867456.62460: generating all_blocks data 23826 1726867456.62461: done generating all_blocks data 23826 1726867456.62465: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 23826 1726867456.62466: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 23826 1726867456.62468: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 23826 1726867456.62846: done processing included file 23826 1726867456.62848: iterating over new_blocks loaded from include file 23826 1726867456.62849: in VariableManager get_vars() 23826 1726867456.62860: done with get_vars() 23826 1726867456.62862: filtering new block on tags 23826 1726867456.62875: done filtering new block on tags 23826 1726867456.62877: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 23826 1726867456.62882: extending task lists for all hosts with included blocks 23826 1726867456.63169: done extending task lists 23826 1726867456.63171: done processing included files 23826 1726867456.63171: results queue empty 23826 1726867456.63172: checking for any_errors_fatal 23826 1726867456.63175: done checking for any_errors_fatal 23826 1726867456.63176: checking for max_fail_percentage 23826 1726867456.63179: done checking for max_fail_percentage 23826 1726867456.63180: checking to see if all hosts have failed and the running result is not ok 23826 1726867456.63181: done checking to see if all hosts have failed 23826 1726867456.63182: getting the remaining hosts for this loop 23826 1726867456.63183: done getting the remaining hosts for this loop 23826 1726867456.63186: getting the next task for host managed_node2 23826 1726867456.63190: done getting next task for host managed_node2 23826 1726867456.63193: ^ task is: TASK: Check routes and DNS 23826 1726867456.63195: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867456.63198: getting variables 23826 1726867456.63199: in VariableManager get_vars() 23826 1726867456.63211: Calling all_inventory to load vars for managed_node2 23826 1726867456.63213: Calling groups_inventory to load vars for managed_node2 23826 1726867456.63216: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867456.63222: Calling all_plugins_play to load vars for managed_node2 23826 1726867456.63224: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867456.63227: Calling groups_plugins_play to load vars for managed_node2 23826 1726867456.64482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867456.66002: done with get_vars() 23826 1726867456.66024: done getting variables 23826 1726867456.66064: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 17:24:16 -0400 (0:00:00.086) 0:00:38.672 ****** 23826 1726867456.66095: entering _queue_task() for managed_node2/shell 23826 1726867456.66434: worker is 1 (out of 1 available) 23826 1726867456.66446: exiting _queue_task() for managed_node2/shell 23826 1726867456.66458: done queuing things up, now waiting for results queue to drain 23826 1726867456.66460: waiting for pending results... 23826 1726867456.66751: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 23826 1726867456.66875: in run() - task 0affcac9-a3a5-a92d-a3ea-00000000069f 23826 1726867456.66906: variable 'ansible_search_path' from source: unknown 23826 1726867456.66918: variable 'ansible_search_path' from source: unknown 23826 1726867456.66956: calling self._execute() 23826 1726867456.67056: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867456.67068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867456.67088: variable 'omit' from source: magic vars 23826 1726867456.67467: variable 'ansible_distribution_major_version' from source: facts 23826 1726867456.67487: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867456.67498: variable 'omit' from source: magic vars 23826 1726867456.67544: variable 'omit' from source: magic vars 23826 1726867456.67589: variable 'omit' from source: magic vars 23826 1726867456.67635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867456.67684: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867456.67711: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867456.67735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867456.67882: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867456.67886: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
23826 1726867456.67888: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867456.67891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867456.67911: Set connection var ansible_timeout to 10 23826 1726867456.67926: Set connection var ansible_shell_executable to /bin/sh 23826 1726867456.67937: Set connection var ansible_connection to ssh 23826 1726867456.67949: Set connection var ansible_pipelining to False 23826 1726867456.67956: Set connection var ansible_shell_type to sh 23826 1726867456.67966: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867456.67998: variable 'ansible_shell_executable' from source: unknown 23826 1726867456.68015: variable 'ansible_connection' from source: unknown 23826 1726867456.68024: variable 'ansible_module_compression' from source: unknown 23826 1726867456.68031: variable 'ansible_shell_type' from source: unknown 23826 1726867456.68039: variable 'ansible_shell_executable' from source: unknown 23826 1726867456.68047: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867456.68056: variable 'ansible_pipelining' from source: unknown 23826 1726867456.68064: variable 'ansible_timeout' from source: unknown 23826 1726867456.68072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867456.68225: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867456.68334: variable 'omit' from source: magic vars 23826 1726867456.68337: starting attempt loop 23826 1726867456.68340: running the handler 23826 1726867456.68343: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867456.68345: _low_level_execute_command(): starting 23826 1726867456.68347: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867456.69038: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867456.69051: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867456.69095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867456.69116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867456.69198: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867456.69231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867456.69312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867456.71000: stdout chunk (state=3): >>>/root <<< 23826 1726867456.71282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867456.71286: stdout chunk (state=3): >>><<< 23826 1726867456.71288: stderr chunk (state=3): >>><<< 23826 1726867456.71291: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867456.71293: _low_level_execute_command(): starting 23826 1726867456.71296: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867456.7122197-25678-120738036456748 `" && echo ansible-tmp-1726867456.7122197-25678-120738036456748="` echo /root/.ansible/tmp/ansible-tmp-1726867456.7122197-25678-120738036456748 `" ) && sleep 0' 23826 1726867456.71861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867456.71867: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867456.71880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867456.71904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867456.71942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867456.72010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867456.72021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867456.72044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867456.72107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867456.74055: stdout chunk (state=3): >>>ansible-tmp-1726867456.7122197-25678-120738036456748=/root/.ansible/tmp/ansible-tmp-1726867456.7122197-25678-120738036456748 <<< 23826 1726867456.74225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867456.74229: stdout chunk (state=3): >>><<< 23826 1726867456.74231: stderr chunk (state=3): >>><<< 23826 1726867456.74259: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867456.7122197-25678-120738036456748=/root/.ansible/tmp/ansible-tmp-1726867456.7122197-25678-120738036456748 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867456.74383: variable 'ansible_module_compression' from source: unknown 23826 1726867456.74387: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 23826 1726867456.74413: variable 'ansible_facts' from source: unknown 23826 1726867456.74511: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867456.7122197-25678-120738036456748/AnsiballZ_command.py 23826 1726867456.74735: Sending initial data 23826 1726867456.74744: Sent initial data (156 bytes) 23826 1726867456.75395: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867456.75411: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867456.75513: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 23826 1726867456.75518: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867456.75520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867456.75522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867456.75547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867456.75615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867456.77202: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 23826 1726867456.77236: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867456.77320: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 23826 1726867456.77363: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpe9m3uyun /root/.ansible/tmp/ansible-tmp-1726867456.7122197-25678-120738036456748/AnsiballZ_command.py <<< 23826 1726867456.77366: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867456.7122197-25678-120738036456748/AnsiballZ_command.py" <<< 23826 1726867456.77429: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpe9m3uyun" to remote "/root/.ansible/tmp/ansible-tmp-1726867456.7122197-25678-120738036456748/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867456.7122197-25678-120738036456748/AnsiballZ_command.py" <<< 23826 1726867456.78184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867456.78269: stderr chunk (state=3): >>><<< 23826 1726867456.78272: stdout chunk (state=3): >>><<< 23826 1726867456.78313: done transferring module to remote 23826 1726867456.78397: _low_level_execute_command(): starting 23826 1726867456.78400: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867456.7122197-25678-120738036456748/ /root/.ansible/tmp/ansible-tmp-1726867456.7122197-25678-120738036456748/AnsiballZ_command.py && sleep 0' 23826 1726867456.78914: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867456.78922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867456.78935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867456.78947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867456.78960: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867456.78968: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867456.78979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867456.78993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867456.79046: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 23826 1726867456.79079: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867456.79127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867456.79130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867456.79168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867456.81196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867456.81200: stdout chunk (state=3): >>><<< 23826 1726867456.81203: stderr chunk (state=3): >>><<< 23826 1726867456.81205: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867456.81207: _low_level_execute_command(): starting 23826 1726867456.81210: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867456.7122197-25678-120738036456748/AnsiballZ_command.py && sleep 0' 23826 1726867456.81860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867456.81916: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867456.81938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867456.81983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867456.82027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867456.98389: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:d5:c3:77:ad brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.12.116/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3190sec preferred_lft 3190sec\n inet6 fe80::8ff:d5ff:fec3:77ad/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.116 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.116 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 17:24:16.973401", "end": "2024-09-20 17:24:16.982135", "delta": "0:00:00.008734", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 23826 1726867457.00084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 23826 1726867457.00088: stdout chunk (state=3): >>><<< 23826 1726867457.00091: stderr chunk (state=3): >>><<< 23826 1726867457.00093: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:d5:c3:77:ad brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.12.116/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3190sec preferred_lft 3190sec\n inet6 fe80::8ff:d5ff:fec3:77ad/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.116 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.116 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 17:24:16.973401", "end": "2024-09-20 17:24:16.982135", "delta": "0:00:00.008734", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
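For readability, the shell carried in the cmd/_raw_params fields of the module result above expands from its JSON-escaped form to roughly the following script (indentation and comments are added here for explanation and are not part of the task itself):

    # Dump addressing, routing, and resolver state on the managed host.
    set -euo pipefail
    echo IP
    ip a            # interfaces and addresses
    echo IP ROUTE
    ip route        # IPv4 routing table
    echo IP -6 ROUTE
    ip -6 route     # IPv6 routing table
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf
    else
        # Report the absence of resolv.conf without failing the task outright.
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :
    fi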
23826 1726867457.00102: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867456.7122197-25678-120738036456748/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867457.00105: _low_level_execute_command(): starting 23826 1726867457.00110: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867456.7122197-25678-120738036456748/ > /dev/null 2>&1 && sleep 0' 23826 1726867457.00744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867457.00752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867457.00763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867457.00779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867457.00792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 23826 1726867457.00882: stderr chunk (state=3): >>>debug2: match not found <<< 23826 1726867457.00885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867457.00888: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 23826 1726867457.00890: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 23826 1726867457.00892: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 23826 1726867457.00894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867457.00896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867457.00933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867457.00959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867457.00970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867457.01039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867457.02976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867457.02983: stdout chunk (state=3): >>><<< 23826 1726867457.02986: stderr chunk (state=3): >>><<< 23826 1726867457.03015: _low_level_execute_command() 
done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867457.03018: handler run complete 23826 1726867457.03048: Evaluated conditional (False): False 23826 1726867457.03060: attempt loop complete, returning result 23826 1726867457.03063: _execute() done 23826 1726867457.03065: dumping result to json 23826 1726867457.03283: done dumping result, returning 23826 1726867457.03287: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [0affcac9-a3a5-a92d-a3ea-00000000069f] 23826 1726867457.03289: sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000069f 23826 1726867457.03367: done sending task result for task 0affcac9-a3a5-a92d-a3ea-00000000069f 23826 1726867457.03371: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008734", "end": "2024-09-20 17:24:16.982135", "rc": 0, "start": "2024-09-20 17:24:16.973401" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:d5:c3:77:ad brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.12.116/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3190sec preferred_lft 3190sec inet6 fe80::8ff:d5ff:fec3:77ad/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.116 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.116 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 23826 1726867457.03457: no more pending results, returning what we have 23826 1726867457.03461: results queue empty 23826 1726867457.03462: checking for any_errors_fatal 23826 1726867457.03464: done checking for any_errors_fatal 23826 1726867457.03464: checking for max_fail_percentage 23826 
1726867457.03466: done checking for max_fail_percentage 23826 1726867457.03467: checking to see if all hosts have failed and the running result is not ok 23826 1726867457.03469: done checking to see if all hosts have failed 23826 1726867457.03469: getting the remaining hosts for this loop 23826 1726867457.03471: done getting the remaining hosts for this loop 23826 1726867457.03475: getting the next task for host managed_node2 23826 1726867457.03488: done getting next task for host managed_node2 23826 1726867457.03491: ^ task is: TASK: Verify DNS and network connectivity 23826 1726867457.03495: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867457.03505: getting variables 23826 1726867457.03506: in VariableManager get_vars() 23826 1726867457.03538: Calling all_inventory to load vars for managed_node2 23826 1726867457.03541: Calling groups_inventory to load vars for managed_node2 23826 1726867457.03546: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867457.03558: Calling all_plugins_play to load vars for managed_node2 23826 1726867457.03561: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867457.03563: Calling groups_plugins_play to load vars for managed_node2 23826 1726867457.06702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867457.08364: done with get_vars() 23826 1726867457.08390: done getting variables 23826 1726867457.08473: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 17:24:17 -0400 (0:00:00.424) 0:00:39.096 ****** 23826 1726867457.08507: entering _queue_task() for managed_node2/shell 23826 1726867457.08972: worker is 1 (out of 1 available) 23826 1726867457.09035: exiting _queue_task() for managed_node2/shell 23826 1726867457.09045: done queuing things up, now waiting for results queue to drain 23826 1726867457.09046: waiting for pending results... 
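The chunks that follow show the same remote-execution pattern used for the previous task: because ansible_pipelining is False for this connection, the shell action uploads AnsiballZ_command.py to the host and runs it there through a series of _low_level_execute_command() calls over the multiplexed SSH connection. Condensed into plain shell from the commands recorded below, with the long per-task directory written out once as TMP (an abbreviation introduced here, not an Ansible variable, and the mkdir step simplified), the sequence is roughly:

    # Per-task temporary directory on the managed host (value taken from the log below).
    TMP=/root/.ansible/tmp/ansible-tmp-1726867457.148675-25699-142447931998670

    /bin/sh -c "echo ~ && sleep 0"                                          # discover the remote home directory
    /bin/sh -c "( umask 77 && mkdir -p /root/.ansible/tmp && mkdir $TMP ) && sleep 0"   # create the temp dir (simplified)
    # AnsiballZ_command.py is then uploaded into $TMP over SFTP ("sftp> put ...")
    /bin/sh -c "chmod u+x $TMP/ $TMP/AnsiballZ_command.py && sleep 0"       # make the module executable
    /bin/sh -c "/usr/bin/python3.12 $TMP/AnsiballZ_command.py && sleep 0"   # run the module
    /bin/sh -c "rm -f -r $TMP/ > /dev/null 2>&1 && sleep 0"                 # remove the temp dir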
23826 1726867457.09294: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 23826 1726867457.09582: in run() - task 0affcac9-a3a5-a92d-a3ea-0000000006a0 23826 1726867457.09586: variable 'ansible_search_path' from source: unknown 23826 1726867457.09588: variable 'ansible_search_path' from source: unknown 23826 1726867457.09595: calling self._execute() 23826 1726867457.09620: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867457.09626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867457.09636: variable 'omit' from source: magic vars 23826 1726867457.10186: variable 'ansible_distribution_major_version' from source: facts 23826 1726867457.10197: Evaluated conditional (ansible_distribution_major_version != '6'): True 23826 1726867457.10442: variable 'ansible_facts' from source: unknown 23826 1726867457.11403: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 23826 1726867457.11411: variable 'omit' from source: magic vars 23826 1726867457.11453: variable 'omit' from source: magic vars 23826 1726867457.11486: variable 'omit' from source: magic vars 23826 1726867457.11526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 23826 1726867457.11568: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 23826 1726867457.11588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 23826 1726867457.11605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867457.11617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 23826 1726867457.11647: variable 'inventory_hostname' from source: host vars for 'managed_node2' 23826 1726867457.11659: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867457.11662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867457.11766: Set connection var ansible_timeout to 10 23826 1726867457.11775: Set connection var ansible_shell_executable to /bin/sh 23826 1726867457.11780: Set connection var ansible_connection to ssh 23826 1726867457.11787: Set connection var ansible_pipelining to False 23826 1726867457.11790: Set connection var ansible_shell_type to sh 23826 1726867457.11984: Set connection var ansible_module_compression to ZIP_DEFLATED 23826 1726867457.11987: variable 'ansible_shell_executable' from source: unknown 23826 1726867457.11990: variable 'ansible_connection' from source: unknown 23826 1726867457.11993: variable 'ansible_module_compression' from source: unknown 23826 1726867457.11995: variable 'ansible_shell_type' from source: unknown 23826 1726867457.11997: variable 'ansible_shell_executable' from source: unknown 23826 1726867457.11999: variable 'ansible_host' from source: host vars for 'managed_node2' 23826 1726867457.12001: variable 'ansible_pipelining' from source: unknown 23826 1726867457.12003: variable 'ansible_timeout' from source: unknown 23826 1726867457.12005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 23826 1726867457.12011: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867457.12013: variable 'omit' from source: magic vars 23826 1726867457.12015: starting attempt loop 23826 1726867457.12018: running the handler 23826 1726867457.12020: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 23826 1726867457.12022: _low_level_execute_command(): starting 23826 1726867457.12032: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 23826 1726867457.12752: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867457.12787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867457.12794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867457.12853: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867457.12896: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867457.12916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867457.12930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867457.13006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867457.14689: stdout chunk (state=3): >>>/root <<< 23826 1726867457.14834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867457.14838: stdout chunk (state=3): >>><<< 23826 1726867457.14840: stderr chunk (state=3): >>><<< 23826 1726867457.14959: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 
10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867457.14963: _low_level_execute_command(): starting 23826 1726867457.14965: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867457.148675-25699-142447931998670 `" && echo ansible-tmp-1726867457.148675-25699-142447931998670="` echo /root/.ansible/tmp/ansible-tmp-1726867457.148675-25699-142447931998670 `" ) && sleep 0' 23826 1726867457.15544: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867457.15560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867457.15573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867457.15640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867457.17543: stdout chunk (state=3): >>>ansible-tmp-1726867457.148675-25699-142447931998670=/root/.ansible/tmp/ansible-tmp-1726867457.148675-25699-142447931998670 <<< 23826 1726867457.17735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867457.17739: stdout chunk (state=3): >>><<< 23826 1726867457.17741: stderr chunk (state=3): >>><<< 23826 1726867457.17883: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867457.148675-25699-142447931998670=/root/.ansible/tmp/ansible-tmp-1726867457.148675-25699-142447931998670 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 
10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867457.17886: variable 'ansible_module_compression' from source: unknown 23826 1726867457.17889: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-238264436z43w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 23826 1726867457.17912: variable 'ansible_facts' from source: unknown 23826 1726867457.18013: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867457.148675-25699-142447931998670/AnsiballZ_command.py 23826 1726867457.18246: Sending initial data 23826 1726867457.18249: Sent initial data (155 bytes) 23826 1726867457.18844: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867457.18861: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867457.18887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867457.18930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867457.18949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867457.18994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867457.19053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867457.19070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867457.19100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867457.19179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867457.20801: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 23826 1726867457.20844: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 23826 1726867457.20904: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-238264436z43w/tmpmnk0x3ft /root/.ansible/tmp/ansible-tmp-1726867457.148675-25699-142447931998670/AnsiballZ_command.py <<< 23826 1726867457.20909: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867457.148675-25699-142447931998670/AnsiballZ_command.py" <<< 23826 1726867457.20942: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-238264436z43w/tmpmnk0x3ft" to remote "/root/.ansible/tmp/ansible-tmp-1726867457.148675-25699-142447931998670/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867457.148675-25699-142447931998670/AnsiballZ_command.py" <<< 23826 1726867457.21884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867457.21931: stderr chunk (state=3): >>><<< 23826 1726867457.21940: stdout chunk (state=3): >>><<< 23826 1726867457.21964: done transferring module to remote 23826 1726867457.21981: _low_level_execute_command(): starting 23826 1726867457.22018: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867457.148675-25699-142447931998670/ /root/.ansible/tmp/ansible-tmp-1726867457.148675-25699-142447931998670/AnsiballZ_command.py && sleep 0' 23826 1726867457.22684: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867457.22735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867457.22755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867457.22770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867457.22851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867457.24703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867457.24706: stdout chunk (state=3): >>><<< 23826 1726867457.24712: stderr chunk (state=3): >>><<< 23826 1726867457.24803: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867457.24806: _low_level_execute_command(): starting 23826 1726867457.24812: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867457.148675-25699-142447931998670/AnsiballZ_command.py && sleep 0' 23826 1726867457.25351: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 23826 1726867457.25366: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 23826 1726867457.25383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 23826 1726867457.25402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 23826 1726867457.25497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 23826 1726867457.25521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 23826 1726867457.25536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867457.25561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867457.25646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867457.50657: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org 
mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 14686 0 --:--:-- --:--:-- --:--:-- 15250\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 7000 0 --:--:-- --:--:-- --:--:-- 7097", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 17:24:17.410117", "end": "2024-09-20 17:24:17.504944", "delta": "0:00:00.094827", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 23826 1726867457.52335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 23826 1726867457.52339: stdout chunk (state=3): >>><<< 23826 1726867457.52342: stderr chunk (state=3): >>><<< 23826 1726867457.52359: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 14686 0 --:--:-- --:--:-- --:--:-- 15250\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 7000 0 --:--:-- --:--:-- --:--:-- 7097", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 17:24:17.410117", "end": "2024-09-20 17:24:17.504944", "delta": "0:00:00.094827", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 23826 1726867457.52399: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867457.148675-25699-142447931998670/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 23826 1726867457.52409: _low_level_execute_command(): starting 23826 1726867457.52412: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867457.148675-25699-142447931998670/ > /dev/null 2>&1 && sleep 0' 23826 1726867457.53022: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 23826 1726867457.53046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 23826 1726867457.53117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 23826 1726867457.54957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 23826 1726867457.54982: stderr chunk (state=3): >>><<< 23826 1726867457.54986: stdout chunk (state=3): >>><<< 23826 1726867457.54996: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 23826 1726867457.55002: handler run complete 23826 1726867457.55020: Evaluated conditional (False): False 23826 1726867457.55028: attempt loop complete, returning result 23826 1726867457.55031: _execute() done 23826 1726867457.55033: dumping result to json 23826 1726867457.55039: done dumping result, returning 23826 1726867457.55046: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [0affcac9-a3a5-a92d-a3ea-0000000006a0] 23826 1726867457.55050: sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000006a0 23826 1726867457.55151: done sending task result for task 0affcac9-a3a5-a92d-a3ea-0000000006a0 23826 1726867457.55154: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.094827", "end": "2024-09-20 17:24:17.504944", "rc": 0, "start": "2024-09-20 17:24:17.410117" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 14686 0 --:--:-- --:--:-- --:--:-- 15250 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 7000 0 --:--:-- --:--:-- --:--:-- 7097 23826 1726867457.55241: no more pending results, returning what we have 23826 1726867457.55244: results queue empty 23826 1726867457.55245: checking for any_errors_fatal 23826 1726867457.55254: done checking for any_errors_fatal 23826 1726867457.55255: checking for max_fail_percentage 23826 1726867457.55256: done checking for max_fail_percentage 23826 1726867457.55257: checking to see if all hosts have failed and the running result is not ok 23826 1726867457.55264: done checking 
to see if all hosts have failed 23826 1726867457.55264: getting the remaining hosts for this loop 23826 1726867457.55266: done getting the remaining hosts for this loop 23826 1726867457.55270: getting the next task for host managed_node2 23826 1726867457.55279: done getting next task for host managed_node2 23826 1726867457.55281: ^ task is: TASK: meta (flush_handlers) 23826 1726867457.55283: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 23826 1726867457.55291: getting variables 23826 1726867457.55292: in VariableManager get_vars() 23826 1726867457.55321: Calling all_inventory to load vars for managed_node2 23826 1726867457.55324: Calling groups_inventory to load vars for managed_node2 23826 1726867457.55327: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867457.55337: Calling all_plugins_play to load vars for managed_node2 23826 1726867457.55339: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867457.55342: Calling groups_plugins_play to load vars for managed_node2 23826 1726867457.60107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867457.61339: done with get_vars() 23826 1726867457.61354: done getting variables 23826 1726867457.61395: in VariableManager get_vars() 23826 1726867457.61403: Calling all_inventory to load vars for managed_node2 23826 1726867457.61404: Calling groups_inventory to load vars for managed_node2 23826 1726867457.61406: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867457.61410: Calling all_plugins_play to load vars for managed_node2 23826 1726867457.61411: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867457.61413: Calling groups_plugins_play to load vars for managed_node2 23826 1726867457.62020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867457.62944: done with get_vars() 23826 1726867457.62960: done queuing things up, now waiting for results queue to drain 23826 1726867457.62961: results queue empty 23826 1726867457.62961: checking for any_errors_fatal 23826 1726867457.62964: done checking for any_errors_fatal 23826 1726867457.62964: checking for max_fail_percentage 23826 1726867457.62965: done checking for max_fail_percentage 23826 1726867457.62965: checking to see if all hosts have failed and the running result is not ok 23826 1726867457.62966: done checking to see if all hosts have failed 23826 1726867457.62966: getting the remaining hosts for this loop 23826 1726867457.62967: done getting the remaining hosts for this loop 23826 1726867457.62969: getting the next task for host managed_node2 23826 1726867457.62971: done getting next task for host managed_node2 23826 1726867457.62972: ^ task is: TASK: meta (flush_handlers) 23826 1726867457.62973: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867457.62975: getting variables 23826 1726867457.62975: in VariableManager get_vars() 23826 1726867457.62982: Calling all_inventory to load vars for managed_node2 23826 1726867457.62983: Calling groups_inventory to load vars for managed_node2 23826 1726867457.62985: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867457.62988: Calling all_plugins_play to load vars for managed_node2 23826 1726867457.62990: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867457.62991: Calling groups_plugins_play to load vars for managed_node2 23826 1726867457.63617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867457.64459: done with get_vars() 23826 1726867457.64471: done getting variables 23826 1726867457.64504: in VariableManager get_vars() 23826 1726867457.64510: Calling all_inventory to load vars for managed_node2 23826 1726867457.64511: Calling groups_inventory to load vars for managed_node2 23826 1726867457.64513: Calling all_plugins_inventory to load vars for managed_node2 23826 1726867457.64515: Calling all_plugins_play to load vars for managed_node2 23826 1726867457.64517: Calling groups_plugins_inventory to load vars for managed_node2 23826 1726867457.64519: Calling groups_plugins_play to load vars for managed_node2 23826 1726867457.65194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 23826 1726867457.66042: done with get_vars() 23826 1726867457.66058: done queuing things up, now waiting for results queue to drain 23826 1726867457.66059: results queue empty 23826 1726867457.66060: checking for any_errors_fatal 23826 1726867457.66061: done checking for any_errors_fatal 23826 1726867457.66061: checking for max_fail_percentage 23826 1726867457.66062: done checking for max_fail_percentage 23826 1726867457.66062: checking to see if all hosts have failed and the running result is not ok 23826 1726867457.66062: done checking to see if all hosts have failed 23826 1726867457.66063: getting the remaining hosts for this loop 23826 1726867457.66063: done getting the remaining hosts for this loop 23826 1726867457.66065: getting the next task for host managed_node2 23826 1726867457.66067: done getting next task for host managed_node2 23826 1726867457.66068: ^ task is: None 23826 1726867457.66068: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 23826 1726867457.66069: done queuing things up, now waiting for results queue to drain
23826 1726867457.66070: results queue empty
23826 1726867457.66070: checking for any_errors_fatal
23826 1726867457.66070: done checking for any_errors_fatal
23826 1726867457.66071: checking for max_fail_percentage
23826 1726867457.66071: done checking for max_fail_percentage
23826 1726867457.66072: checking to see if all hosts have failed and the running result is not ok
23826 1726867457.66072: done checking to see if all hosts have failed
23826 1726867457.66073: getting the next task for host managed_node2
23826 1726867457.66074: done getting next task for host managed_node2
23826 1726867457.66074: ^ task is: None
23826 1726867457.66075: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node2 : ok=75 changed=2 unreachable=0 failed=0 skipped=75 rescued=0 ignored=1

Friday 20 September 2024 17:24:17 -0400 (0:00:00.576) 0:00:39.672 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.04s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 2.02s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 1.99s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.83s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.66s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:6
Create veth interface ethtest0 ------------------------------------------ 1.31s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.28s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 1.22s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.18s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:3
Install iproute --------------------------------------------------------- 1.16s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Gathering Facts --------------------------------------------------------- 1.12s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.09s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:80
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.92s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 0.79s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.78s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 0.75s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gather the minimum subset of ansible_facts required by the network role test --- 0.74s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.68s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Check if system is ostree ----------------------------------------------- 0.68s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.67s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
23826 1726867457.66133: RUNNING CLEANUP
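For reference, the check executed by the final 'Verify DNS and network connectivity' task above expands from its JSON-escaped cmd to roughly the following script (comments are added here for explanation and are not part of the original task):

    # Resolve and contact each mirror host; stop at the first failure.
    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
        if ! getent hosts "$host"; then               # name resolution via the system resolver
            echo FAILED to lookup host "$host"
            exit 1
        fi
        if ! curl -o /dev/null https://"$host"; then  # basic HTTPS reachability
            echo FAILED to contact host "$host"
            exit 1
        fi
    done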