[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 13131 1726867184.13015: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-Isn executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 13131 1726867184.13879: Added group all to inventory 13131 1726867184.13882: Added group ungrouped to inventory 13131 1726867184.13886: Group all now contains ungrouped 13131 1726867184.13889: Examining possible inventory source: /tmp/network-5rw/inventory.yml 13131 1726867184.38059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 13131 1726867184.38238: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 13131 1726867184.38267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 13131 1726867184.38417: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 13131 1726867184.38556: Loaded config def from plugin (inventory/script) 13131 1726867184.38558: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 13131 1726867184.38604: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 13131 1726867184.38711: Loaded config def from plugin 
(inventory/yaml) 13131 1726867184.38713: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 13131 1726867184.38861: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 13131 1726867184.39750: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 13131 1726867184.39754: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 13131 1726867184.39757: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 13131 1726867184.39763: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 13131 1726867184.39767: Loading data from /tmp/network-5rw/inventory.yml 13131 1726867184.39874: /tmp/network-5rw/inventory.yml was not parsable by auto 13131 1726867184.40011: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 13131 1726867184.40065: Loading data from /tmp/network-5rw/inventory.yml 13131 1726867184.40157: group all already in inventory 13131 1726867184.40164: set inventory_file for managed_node1 13131 1726867184.40167: set inventory_dir for managed_node1 13131 1726867184.40168: Added host managed_node1 to inventory 13131 1726867184.40171: Added host managed_node1 to group all 13131 1726867184.40172: set ansible_host for managed_node1 13131 1726867184.40172: set ansible_ssh_extra_args for managed_node1 13131 1726867184.40176: set inventory_file for managed_node2 13131 1726867184.40181: set inventory_dir for managed_node2 13131 1726867184.40182: Added host managed_node2 to inventory 13131 1726867184.40183: Added host managed_node2 to group all 13131 1726867184.40184: set ansible_host for managed_node2 13131 1726867184.40185: set ansible_ssh_extra_args for managed_node2 13131 
1726867184.40187: set inventory_file for managed_node3 13131 1726867184.40189: set inventory_dir for managed_node3 13131 1726867184.40193: Added host managed_node3 to inventory 13131 1726867184.40194: Added host managed_node3 to group all 13131 1726867184.40195: set ansible_host for managed_node3 13131 1726867184.40196: set ansible_ssh_extra_args for managed_node3 13131 1726867184.40198: Reconcile groups and hosts in inventory. 13131 1726867184.40202: Group ungrouped now contains managed_node1 13131 1726867184.40204: Group ungrouped now contains managed_node2 13131 1726867184.40206: Group ungrouped now contains managed_node3 13131 1726867184.40288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 13131 1726867184.40419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 13131 1726867184.40470: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 13131 1726867184.40503: Loaded config def from plugin (vars/host_group_vars) 13131 1726867184.40505: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 13131 1726867184.40513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 13131 1726867184.40520: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 13131 1726867184.40561: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 13131 1726867184.40974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867184.41112: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 13131 1726867184.41161: Loaded config def from plugin (connection/local) 13131 1726867184.41165: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 13131 1726867184.42072: Loaded config def from plugin (connection/paramiko_ssh) 13131 1726867184.42075: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 13131 1726867184.43730: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 13131 1726867184.43781: Loaded config def from plugin (connection/psrp) 13131 1726867184.43785: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 13131 1726867184.45472: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 13131 1726867184.45639: Loaded config def from plugin (connection/ssh) 13131 1726867184.45643: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 13131 1726867184.48145: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 13131 1726867184.48331: Loaded config def from plugin (connection/winrm) 13131 1726867184.48335: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 13131 1726867184.48369: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 13131 1726867184.48437: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 13131 1726867184.48618: Loaded config def from plugin (shell/cmd) 13131 1726867184.48620: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 13131 1726867184.48647: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 13131 1726867184.48722: Loaded config def from plugin (shell/powershell) 13131 1726867184.48725: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 13131 1726867184.48779: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 13131 1726867184.48973: Loaded config def from plugin (shell/sh) 13131 1726867184.48975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 13131 1726867184.49016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 13131 1726867184.49140: Loaded config def from plugin (become/runas) 13131 1726867184.49143: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 13131 1726867184.49341: Loaded config def from plugin (become/su) 13131 1726867184.49343: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 13131 1726867184.49511: Loaded config def from plugin (become/sudo) 13131 
1726867184.49513: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 13131 1726867184.49545: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml 13131 1726867184.49980: in VariableManager get_vars() 13131 1726867184.50040: done with get_vars() 13131 1726867184.50197: trying /usr/local/lib/python3.12/site-packages/ansible/modules 13131 1726867184.53560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 13131 1726867184.53688: in VariableManager get_vars() 13131 1726867184.53696: done with get_vars() 13131 1726867184.53699: variable 'playbook_dir' from source: magic vars 13131 1726867184.53700: variable 'ansible_playbook_python' from source: magic vars 13131 1726867184.53701: variable 'ansible_config_file' from source: magic vars 13131 1726867184.53702: variable 'groups' from source: magic vars 13131 1726867184.53702: variable 'omit' from source: magic vars 13131 1726867184.53703: variable 'ansible_version' from source: magic vars 13131 1726867184.53704: variable 'ansible_check_mode' from source: magic vars 13131 1726867184.53705: variable 'ansible_diff_mode' from source: magic vars 13131 1726867184.53705: variable 'ansible_forks' from source: magic vars 13131 1726867184.53706: variable 'ansible_inventory_sources' from source: magic vars 13131 1726867184.53707: variable 'ansible_skip_tags' from source: magic vars 13131 1726867184.53708: variable 'ansible_limit' from source: magic vars 13131 1726867184.53708: variable 'ansible_run_tags' from source: magic vars 13131 1726867184.53709: variable 'ansible_verbosity' from source: magic vars 13131 1726867184.53748: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml 13131 1726867184.55045: in 
VariableManager get_vars() 13131 1726867184.55061: done with get_vars() 13131 1726867184.55071: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 13131 1726867184.56585: in VariableManager get_vars() 13131 1726867184.56602: done with get_vars() 13131 1726867184.56611: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 13131 1726867184.56818: in VariableManager get_vars() 13131 1726867184.56833: done with get_vars() 13131 1726867184.57086: in VariableManager get_vars() 13131 1726867184.57122: done with get_vars() 13131 1726867184.57132: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 13131 1726867184.57267: in VariableManager get_vars() 13131 1726867184.57317: done with get_vars() 13131 1726867184.57830: in VariableManager get_vars() 13131 1726867184.57843: done with get_vars() 13131 1726867184.57848: variable 'omit' from source: magic vars 13131 1726867184.57989: variable 'omit' from source: magic vars 13131 1726867184.58026: in VariableManager get_vars() 13131 1726867184.58038: done with get_vars() 13131 1726867184.58194: in VariableManager get_vars() 13131 1726867184.58207: done with get_vars() 13131 1726867184.58242: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 13131 
1726867184.58720: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 13131 1726867184.58968: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 13131 1726867184.62576: in VariableManager get_vars() 13131 1726867184.62606: done with get_vars() 13131 1726867184.63267: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 13131 1726867184.63453: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13131 1726867184.66026: in VariableManager get_vars() 13131 1726867184.66046: done with get_vars() 13131 1726867184.66054: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 13131 1726867184.66151: in VariableManager get_vars() 13131 1726867184.66170: done with get_vars() 13131 1726867184.66298: in VariableManager get_vars() 13131 1726867184.66314: done with get_vars() 13131 1726867184.66603: in VariableManager get_vars() 13131 1726867184.66619: done with get_vars() 13131 1726867184.66624: variable 'omit' from source: magic vars 13131 1726867184.66634: variable 'omit' from source: magic vars 13131 1726867184.66812: variable 'controller_profile' from source: play vars 13131 1726867184.66934: in VariableManager get_vars() 13131 1726867184.66948: done with get_vars() 13131 1726867184.66967: in VariableManager get_vars() 13131 1726867184.67099: done with get_vars() 13131 1726867184.67128: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 13131 1726867184.67367: Loading data from 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 13131 1726867184.67652: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 13131 1726867184.68193: in VariableManager get_vars() 13131 1726867184.68214: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13131 1726867184.71008: in VariableManager get_vars() 13131 1726867184.71027: done with get_vars() 13131 1726867184.71031: variable 'omit' from source: magic vars 13131 1726867184.71041: variable 'omit' from source: magic vars 13131 1726867184.71075: in VariableManager get_vars() 13131 1726867184.71094: done with get_vars() 13131 1726867184.71119: in VariableManager get_vars() 13131 1726867184.71137: done with get_vars() 13131 1726867184.71169: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 13131 1726867184.71292: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 13131 1726867184.71385: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 13131 1726867184.71795: in VariableManager get_vars() 13131 1726867184.71821: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13131 1726867184.73834: in VariableManager get_vars() 13131 1726867184.73852: done with get_vars() 13131 1726867184.73855: variable 'omit' from source: magic vars 13131 1726867184.73864: variable 'omit' from source: magic vars 13131 1726867184.73899: in VariableManager get_vars() 13131 1726867184.73931: done with get_vars() 13131 1726867184.73950: in VariableManager get_vars() 13131 1726867184.73969: done with get_vars() 13131 1726867184.74003: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 13131 
1726867184.74117: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 13131 1726867184.74207: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 13131 1726867184.75011: in VariableManager get_vars() 13131 1726867184.75151: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13131 1726867184.79063: in VariableManager get_vars() 13131 1726867184.79089: done with get_vars() 13131 1726867184.79097: variable 'omit' from source: magic vars 13131 1726867184.79120: variable 'omit' from source: magic vars 13131 1726867184.79161: in VariableManager get_vars() 13131 1726867184.79307: done with get_vars() 13131 1726867184.79327: in VariableManager get_vars() 13131 1726867184.79348: done with get_vars() 13131 1726867184.79537: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 13131 1726867184.79784: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 13131 1726867184.79866: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 13131 1726867184.80381: in VariableManager get_vars() 13131 1726867184.80410: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13131 1726867184.83503: in VariableManager get_vars() 13131 1726867184.83884: done with get_vars() 13131 1726867184.83895: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 13131 1726867184.84899: in VariableManager get_vars() 13131 1726867184.84928: done with get_vars() 13131 
1726867184.85064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 13131 1726867184.85224: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 13131 1726867184.86006: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 13131 1726867184.86394: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 13131 1726867184.86511: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 13131 1726867184.86542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 13131 1726867184.86567: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 13131 1726867184.87021: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 13131 1726867184.87084: Loaded config def from plugin (callback/default) 13131 1726867184.87221: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 13131 1726867184.88664: Loaded config def from plugin (callback/junit) 13131 1726867184.88666: 
Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 13131 1726867184.88715: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 13131 1726867184.88786: Loaded config def from plugin (callback/minimal) 13131 1726867184.88789: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 13131 1726867184.88828: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 13131 1726867184.88897: Loaded config def from plugin (callback/tree) 13131 1726867184.88900: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 13131 1726867184.89031: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 13131 1726867184.89033: Loading CallbackModule 
'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_bond_removal_nm.yml ******************************************** 2 plays in /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml 13131 1726867184.89206: in VariableManager get_vars() 13131 1726867184.89220: done with get_vars() 13131 1726867184.89226: in VariableManager get_vars() 13131 1726867184.89236: done with get_vars() 13131 1726867184.89244: variable 'omit' from source: magic vars 13131 1726867184.89431: in VariableManager get_vars() 13131 1726867184.89445: done with get_vars() 13131 1726867184.89465: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_bond_removal.yml' with nm as provider] ***** 13131 1726867184.90651: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 13131 1726867184.90723: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 13131 1726867184.90904: getting the remaining hosts for this loop 13131 1726867184.90906: done getting the remaining hosts for this loop 13131 1726867184.90909: getting the next task for host managed_node1 13131 1726867184.90914: done getting next task for host managed_node1 13131 1726867184.90915: ^ task is: TASK: Gathering Facts 13131 1726867184.90917: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, 
update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867184.90919: getting variables 13131 1726867184.90920: in VariableManager get_vars() 13131 1726867184.90929: Calling all_inventory to load vars for managed_node1 13131 1726867184.90931: Calling groups_inventory to load vars for managed_node1 13131 1726867184.90933: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867184.90944: Calling all_plugins_play to load vars for managed_node1 13131 1726867184.90955: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867184.91080: Calling groups_plugins_play to load vars for managed_node1 13131 1726867184.91115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867184.91167: done with get_vars() 13131 1726867184.91173: done getting variables 13131 1726867184.91259: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:6 Friday 20 September 2024 17:19:44 -0400 (0:00:00.023) 0:00:00.023 ****** 13131 1726867184.91281: entering _queue_task() for managed_node1/gather_facts 13131 1726867184.91283: Creating lock for gather_facts 13131 1726867184.91758: worker is 1 (out of 1 available) 13131 1726867184.91768: exiting _queue_task() for managed_node1/gather_facts 13131 1726867184.91781: done queuing things up, now waiting for results queue to drain 13131 1726867184.91782: waiting for pending 
results... 13131 1726867184.91921: running TaskExecutor() for managed_node1/TASK: Gathering Facts 13131 1726867184.92059: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000001bc 13131 1726867184.92063: variable 'ansible_search_path' from source: unknown 13131 1726867184.92090: calling self._execute() 13131 1726867184.92159: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867184.92275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867184.92281: variable 'omit' from source: magic vars 13131 1726867184.92295: variable 'omit' from source: magic vars 13131 1726867184.92329: variable 'omit' from source: magic vars 13131 1726867184.92367: variable 'omit' from source: magic vars 13131 1726867184.92419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867184.92469: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867184.92523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867184.92552: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867184.92586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867184.92627: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867184.92636: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867184.92643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867184.92818: Set connection var ansible_connection to ssh 13131 1726867184.92823: Set connection var ansible_timeout to 10 13131 1726867184.92862: Set connection var ansible_shell_type to sh 13131 1726867184.92865: Set connection var ansible_shell_executable to /bin/sh 
13131 1726867184.92886: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867184.92943: Set connection var ansible_pipelining to False 13131 1726867184.93145: variable 'ansible_shell_executable' from source: unknown 13131 1726867184.93148: variable 'ansible_connection' from source: unknown 13131 1726867184.93150: variable 'ansible_module_compression' from source: unknown 13131 1726867184.93152: variable 'ansible_shell_type' from source: unknown 13131 1726867184.93154: variable 'ansible_shell_executable' from source: unknown 13131 1726867184.93156: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867184.93158: variable 'ansible_pipelining' from source: unknown 13131 1726867184.93160: variable 'ansible_timeout' from source: unknown 13131 1726867184.93162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867184.93530: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867184.93533: variable 'omit' from source: magic vars 13131 1726867184.93536: starting attempt loop 13131 1726867184.93539: running the handler 13131 1726867184.93541: variable 'ansible_facts' from source: unknown 13131 1726867184.93543: _low_level_execute_command(): starting 13131 1726867184.93554: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867184.94786: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867184.94809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867184.94896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867184.94934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867184.94957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867184.94974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867184.95111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867184.96765: stdout chunk (state=3): >>>/root <<< 13131 1726867184.96921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867184.96935: stdout chunk (state=3): >>><<< 13131 1726867184.96973: stderr chunk (state=3): >>><<< 13131 1726867184.97000: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867184.97286: _low_level_execute_command(): starting 13131 1726867184.97290: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867184.9720309-13187-279519658609764 `" && echo ansible-tmp-1726867184.9720309-13187-279519658609764="` echo /root/.ansible/tmp/ansible-tmp-1726867184.9720309-13187-279519658609764 `" ) && sleep 0' 13131 1726867184.98333: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867184.98346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867184.98366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867184.98386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867184.98407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867184.98453: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867184.98529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867184.98557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867184.98572: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867184.98659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867185.00704: stdout chunk (state=3): >>>ansible-tmp-1726867184.9720309-13187-279519658609764=/root/.ansible/tmp/ansible-tmp-1726867184.9720309-13187-279519658609764 <<< 13131 1726867185.00725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867185.00784: stderr chunk (state=3): >>><<< 13131 1726867185.00787: stdout chunk (state=3): >>><<< 13131 1726867185.00804: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867184.9720309-13187-279519658609764=/root/.ansible/tmp/ansible-tmp-1726867184.9720309-13187-279519658609764 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867185.00884: variable 'ansible_module_compression' from source: unknown 13131 1726867185.00907: ANSIBALLZ: Using generic lock for ansible.legacy.setup 13131 1726867185.00914: ANSIBALLZ: Acquiring lock 13131 1726867185.00922: ANSIBALLZ: Lock acquired: 140192901613856 13131 1726867185.00929: ANSIBALLZ: Creating module 13131 1726867185.35615: ANSIBALLZ: Writing module into payload 13131 1726867185.35763: ANSIBALLZ: Writing module 13131 1726867185.35853: ANSIBALLZ: Renaming module 13131 1726867185.35869: ANSIBALLZ: Done creating module 13131 1726867185.36046: variable 'ansible_facts' from source: unknown 13131 1726867185.36050: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867185.36056: _low_level_execute_command(): starting 13131 1726867185.36068: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 13131 1726867185.37175: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867185.37204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867185.37361: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867185.37605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867185.37653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867185.39332: stdout chunk (state=3): >>>PLATFORM <<< 13131 1726867185.39412: stdout chunk (state=3): >>>Linux <<< 13131 1726867185.39445: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 13131 1726867185.39609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867185.39620: stdout chunk (state=3): >>><<< 13131 1726867185.39642: stderr chunk (state=3): >>><<< 13131 1726867185.39738: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867185.39744 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 13131 1726867185.39804: _low_level_execute_command(): starting 13131 1726867185.39827: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 13131 1726867185.40222: Sending initial data 13131 1726867185.40225: Sent initial data (1181 bytes) 13131 1726867185.41282: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867185.41300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867185.41327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867185.41353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867185.41370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867185.41735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867185.44880: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 13131 1726867185.45297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867185.45307: stdout chunk (state=3): >>><<< 13131 1726867185.45317: stderr chunk (state=3): >>><<< 13131 1726867185.45395: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867185.45486: variable 'ansible_facts' from source: unknown 13131 1726867185.45588: variable 'ansible_facts' from source: unknown 13131 1726867185.45641: variable 'ansible_module_compression' from source: unknown 13131 1726867185.45799: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 13131 1726867185.45833: variable 'ansible_facts' from source: unknown 13131 1726867185.46219: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867184.9720309-13187-279519658609764/AnsiballZ_setup.py 13131 1726867185.46411: Sending initial data 13131 1726867185.46415: Sent initial data (154 bytes) 13131 1726867185.47027: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867185.47098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867185.47165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867185.47188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867185.47218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867185.47287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867185.48897: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867185.48934: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13131 1726867185.48994: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp34fanv5v /root/.ansible/tmp/ansible-tmp-1726867184.9720309-13187-279519658609764/AnsiballZ_setup.py <<< 13131 1726867185.48997: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867184.9720309-13187-279519658609764/AnsiballZ_setup.py" <<< 13131 1726867185.49028: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp34fanv5v" to remote "/root/.ansible/tmp/ansible-tmp-1726867184.9720309-13187-279519658609764/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867184.9720309-13187-279519658609764/AnsiballZ_setup.py" <<< 13131 1726867185.50555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867185.50686: stderr chunk (state=3): >>><<< 13131 1726867185.50689: stdout chunk (state=3): >>><<< 13131 1726867185.50695: done transferring module to remote 13131 1726867185.50697: _low_level_execute_command(): starting 13131 1726867185.50699: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867184.9720309-13187-279519658609764/ /root/.ansible/tmp/ansible-tmp-1726867184.9720309-13187-279519658609764/AnsiballZ_setup.py && sleep 0' 
13131 1726867185.51502: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867185.51514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867185.51543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867185.51558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867185.51586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867185.51661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867185.53456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867185.53460: stdout chunk (state=3): >>><<< 13131 1726867185.53462: stderr chunk (state=3): >>><<< 13131 1726867185.53480: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867185.53560: _low_level_execute_command(): starting 13131 1726867185.53564: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867184.9720309-13187-279519658609764/AnsiballZ_setup.py && sleep 0' 13131 1726867185.54053: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867185.54153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867185.54166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867185.54250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867185.56422: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 13131 1726867185.56472: stdout chunk (state=3): >>>import _imp # builtin <<< 13131 1726867185.56487: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 13131 1726867185.56555: stdout chunk (state=3): >>>import '_io' # <<< 13131 1726867185.56581: stdout chunk (state=3): >>>import 'marshal' # <<< 13131 1726867185.56603: stdout chunk (state=3): >>>import 'posix' # <<< 13131 1726867185.56631: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 13131 1726867185.56661: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 13131 1726867185.56737: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 13131 1726867185.56750: stdout chunk (state=3): >>>import 'codecs' # <<< 13131 1726867185.56796: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 13131 1726867185.56826: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9a184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd99e7b30> <<< 13131 1726867185.56853: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 13131 1726867185.56868: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9a1aa50> <<< 13131 1726867185.56924: stdout chunk (state=3): >>>import '_signal' # import '_abc' # <<< 13131 1726867185.56927: stdout chunk (state=3): >>>import 'abc' # <<< 13131 1726867185.56963: stdout chunk (state=3): >>>import 'io' # <<< 13131 1726867185.56966: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 13131 1726867185.57051: stdout chunk (state=3): >>>import '_collections_abc' # <<< 13131 1726867185.57082: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 13131 1726867185.57122: stdout chunk (state=3): >>>import 'os' # <<< 13131 1726867185.57157: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages <<< 13131 1726867185.57190: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 13131 1726867185.57207: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 13131 1726867185.57238: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd982d130> <<< 13131 1726867185.57291: stdout 
chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 13131 1726867185.57320: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd982dfa0> <<< 13131 1726867185.57331: stdout chunk (state=3): >>>import 'site' # <<< 13131 1726867185.57350: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 13131 1726867185.58061: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 13131 1726867185.58101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 13131 1726867185.58105: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py<<< 13131 1726867185.58115: stdout chunk (state=3): >>> <<< 13131 1726867185.58171: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 13131 1726867185.58199: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd986be00> <<< 13131 1726867185.58219: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches 
/usr/lib64/python3.12/operator.py<<< 13131 1726867185.58251: stdout chunk (state=3): >>> <<< 13131 1726867185.58262: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc'<<< 13131 1726867185.58302: stdout chunk (state=3): >>> import '_operator' # <<< 13131 1726867185.58349: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd986bec0><<< 13131 1726867185.58352: stdout chunk (state=3): >>> <<< 13131 1726867185.58363: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py<<< 13131 1726867185.58409: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc'<<< 13131 1726867185.58420: stdout chunk (state=3): >>> <<< 13131 1726867185.58453: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 13131 1726867185.58613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py<<< 13131 1726867185.58647: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98a37d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 13131 1726867185.58681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98a3e60><<< 13131 1726867185.58711: stdout chunk (state=3): >>> import '_collections' # <<< 13131 1726867185.58738: stdout chunk (state=3): >>> <<< 13131 
1726867185.58788: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9883ad0> <<< 13131 1726867185.58818: stdout chunk (state=3): >>>import '_functools' # <<< 13131 1726867185.58867: stdout chunk (state=3): >>> import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98811f0><<< 13131 1726867185.58873: stdout chunk (state=3): >>> <<< 13131 1726867185.59025: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9868fb0> <<< 13131 1726867185.59092: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc'<<< 13131 1726867185.59095: stdout chunk (state=3): >>> <<< 13131 1726867185.59120: stdout chunk (state=3): >>>import '_sre' # <<< 13131 1726867185.59141: stdout chunk (state=3): >>> <<< 13131 1726867185.59197: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc'<<< 13131 1726867185.59203: stdout chunk (state=3): >>> <<< 13131 1726867185.59232: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py<<< 13131 1726867185.59255: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc'<<< 13131 1726867185.59304: stdout chunk (state=3): >>> import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98c3770><<< 13131 1726867185.59309: stdout chunk (state=3): >>> <<< 13131 1726867185.59341: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98c2390><<< 13131 1726867185.59346: stdout chunk 
(state=3): >>> <<< 13131 1726867185.59384: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc'<<< 13131 1726867185.59405: stdout chunk (state=3): >>> import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9882090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98c0bc0><<< 13131 1726867185.59480: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py<<< 13131 1726867185.59487: stdout chunk (state=3): >>> <<< 13131 1726867185.59508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc'<<< 13131 1726867185.59521: stdout chunk (state=3): >>> <<< 13131 1726867185.59527: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98f8800> <<< 13131 1726867185.59545: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9868230><<< 13131 1726867185.59580: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 13131 1726867185.59631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so'<<< 13131 1726867185.59636: stdout chunk (state=3): >>> <<< 13131 1726867185.59659: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867185.59719: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd98f8cb0> import 'struct' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98f8b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867185.59743: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867185.59768: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd98f8ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9866d50><<< 13131 1726867185.59817: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py<<< 13131 1726867185.59829: stdout chunk (state=3): >>> <<< 13131 1726867185.59841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 13131 1726867185.59949: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98f9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98f9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 13131 1726867185.59959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 13131 1726867185.60004: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98fa480> import 'importlib.util' # <<< 13131 1726867185.60039: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc 
matches /usr/lib64/python3.12/shutil.py <<< 13131 1726867185.60080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 13131 1726867185.60107: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd99106b0> <<< 13131 1726867185.60144: stdout chunk (state=3): >>>import 'errno' # <<< 13131 1726867185.60180: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd9911d90> <<< 13131 1726867185.60184: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 13131 1726867185.60215: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 13131 1726867185.60234: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9912c30> <<< 13131 1726867185.60292: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd9913290> <<< 13131 1726867185.60318: stdout chunk (state=3): >>>import 'bz2' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9912180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 13131 1726867185.60370: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867185.60385: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd9913d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9913440> <<< 13131 1726867185.60438: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98fa4e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 13131 1726867185.60475: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 13131 1726867185.60524: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 13131 1726867185.60597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd9607bc0> <<< 13131 1726867185.60654: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module 
'_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd96306e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9630440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd9630620> <<< 13131 1726867185.60800: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867185.60867: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd9630fe0> <<< 13131 1726867185.61071: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd9631970> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9630890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9605d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches 
/usr/lib64/python3.12/weakref.py <<< 13131 1726867185.61074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 13131 1726867185.61096: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9632cf0> <<< 13131 1726867185.61130: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9630e60> <<< 13131 1726867185.61297: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98fabd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 13131 1726867185.61319: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd965f020> <<< 13131 1726867185.61387: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 13131 1726867185.61400: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 13131 1726867185.61594: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f6fd96833e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 13131 1726867185.61766: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd96e01a0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 13131 1726867185.61798: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 13131 1726867185.62034: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd96e28d0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd96e02c0><<< 13131 1726867185.62059: stdout chunk (state=3): >>> <<< 13131 1726867185.62100: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd96ad190> <<< 13131 1726867185.62150: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 13131 1726867185.62185: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd94f11f0> <<< 13131 1726867185.62236: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f6fd96821e0> <<< 13131 1726867185.62239: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9633bf0><<< 13131 1726867185.62242: stdout chunk (state=3): >>> <<< 13131 1726867185.62524: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc'<<< 13131 1726867185.62556: stdout chunk (state=3): >>> import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6fd96828a0> <<< 13131 1726867185.62980: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_qfyhkfq9/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 13131 1726867185.63226: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 13131 1726867185.63229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 13131 1726867185.63287: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 13131 1726867185.63425: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9552f00> <<< 13131 1726867185.63428: stdout chunk (state=3): >>>import '_typing' # <<< 13131 1726867185.63693: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9531df0> <<< 13131 1726867185.63713: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9530f50> # zipimport: zlib available 
<<< 13131 1726867185.63750: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 13131 1726867185.63789: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13131 1726867185.63796: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 13131 1726867185.63839: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.65439: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.66569: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9550dd0> <<< 13131 1726867185.66598: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 13131 1726867185.66638: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 13131 1726867185.66643: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 13131 1726867185.66682: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 13131 1726867185.66692: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867185.66718: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd958a780> <<< 13131 1726867185.66733: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd958a510> <<< 13131 1726867185.66762: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9589e20> <<< 13131 1726867185.66799: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 13131 1726867185.66803: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 13131 1726867185.66869: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd958a840> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9553b90> import 'atexit' # <<< 13131 1726867185.66883: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867185.66932: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd958b4a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd958b650> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 13131 1726867185.67054: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f6fd958bb90> import 'pwd' # <<< 13131 1726867185.67189: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f2d970> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8f2f590> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 13131 1726867185.67199: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 13131 1726867185.67218: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f2ff20> <<< 13131 1726867185.67240: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 13131 1726867185.67271: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 13131 1726867185.67339: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f35100> <<< 13131 1726867185.67366: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 13131 1726867185.67430: stdout chunk (state=3): >>>import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f37bc0> <<< 13131 1726867185.67492: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd9532ff0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f35e80> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 13131 1726867185.67550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 13131 1726867185.67561: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 13131 1726867185.67664: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 13131 1726867185.67812: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f3bb90> <<< 13131 1726867185.67815: stdout chunk (state=3): >>>import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f3a660> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f3a3c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code 
object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 13131 1726867185.67898: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f3a930> <<< 13131 1726867185.67912: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f36390> <<< 13131 1726867185.67941: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8f7fe90> <<< 13131 1726867185.67986: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f7f890> <<< 13131 1726867185.68021: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 13131 1726867185.68047: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 13131 1726867185.68196: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867185.68218: stdout chunk (state=3): >>># extension module '_datetime' executed from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8f81a60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f81820> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8f83fe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f82150> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 13131 1726867185.68384: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f87680> <<< 13131 1726867185.68706: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f84140> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8f88a10> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8f888c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8f880b0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f801d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867185.68710: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8e14170> <<< 13131 1726867185.68882: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867185.68888: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f6fd8e153a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f8a8d0> <<< 13131 1726867185.68941: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8f8bc50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f8a4e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 13131 1726867185.68962: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.69043: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.69137: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.69190: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 13131 1726867185.69336: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13131 1726867185.69426: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.69962: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.70480: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 13131 1726867185.70510: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 13131 1726867185.70542: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from 
'/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 13131 1726867185.70590: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8e194c0> <<< 13131 1726867185.70759: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 13131 1726867185.70833: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8e1a1e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8e155b0> import 'ansible.module_utils.compat.selinux' # <<< 13131 1726867185.70879: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 13131 1726867185.71098: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.71294: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8e1a180> # zipimport: zlib available <<< 13131 1726867185.71570: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.72441: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 13131 1726867185.72482: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.72536: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 13131 1726867185.72539: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.72876: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.72976: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 13131 1726867185.73024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 13131 1726867185.73071: stdout chunk (state=3): >>>import '_ast' # <<< 13131 1726867185.73106: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8e1b2f0> <<< 13131 1726867185.73127: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.73274: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.73283: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 13131 1726867185.73319: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 13131 1726867185.73342: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.73442: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 13131 1726867185.73445: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13131 1726867185.73599: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13131 1726867185.73602: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 13131 1726867185.73648: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 13131 1726867185.73868: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8e25d60> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8e215b0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available <<< 13131 1726867185.73919: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.73966: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.74117: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 13131 1726867185.74120: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 13131 1726867185.74123: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 13131 1726867185.74203: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 13131 1726867185.74206: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f0e780> <<< 13131 1726867185.74242: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8ffa480> <<< 13131 1726867185.74382: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8e25dc0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f88e60> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 13131 1726867185.74444: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 13131 1726867185.74505: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available <<< 13131 1726867185.74684: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available <<< 13131 1726867185.74687: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 13131 1726867185.74703: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.74742: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.75024: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.75028: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # 
zipimport: zlib available <<< 13131 1726867185.75206: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.75381: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.75413: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.75689: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 13131 1726867185.75693: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 13131 1726867185.75710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8eb9e50> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 13131 1726867185.75790: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8a3bdd0> # 
extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8a403e0> <<< 13131 1726867185.75881: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8ea2c30> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8eba9c0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8eb8530> <<< 13131 1726867185.75900: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8eb8f20> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 13131 1726867185.75998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 13131 1726867185.76098: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8a430b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8a42990> <<< 13131 1726867185.76290: stdout chunk (state=3): >>># extension 
module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8a42b40> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8a41e20> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 13131 1726867185.76317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8a431a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8aadca0> <<< 13131 1726867185.76366: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8a43c80> <<< 13131 1726867185.76519: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8eb81d0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available <<< 13131 1726867185.76601: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 13131 1726867185.76604: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13131 1726867185.76805: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available <<< 13131 1726867185.76903: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 13131 1726867185.76906: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.76946: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 13131 1726867185.76966: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.77189: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # <<< 13131 1726867185.77399: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 13131 1726867185.77658: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.78085: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 13131 1726867185.78143: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.78195: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.78229: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.78259: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 13131 1726867185.78520: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 
13131 1726867185.78550: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 13131 1726867185.78569: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.78599: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 13131 1726867185.78686: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.78776: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 13131 1726867185.78815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8aad9d0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 13131 1726867185.78847: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 13131 1726867185.78981: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8aae990> import 'ansible.module_utils.facts.system.local' # <<< 13131 1726867185.78984: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.79038: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.79175: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 13131 1726867185.79200: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13131 1726867185.79289: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 13131 
1726867185.79309: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.79399: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.79443: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 13131 1726867185.79446: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.79485: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.79523: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 13131 1726867185.79571: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 13131 1726867185.79642: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867185.79730: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8addfa0> <<< 13131 1726867185.79889: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8acdd30> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 13131 1726867185.79950: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.80008: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 13131 1726867185.80019: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.80091: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.80174: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.80295: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.80505: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 13131 
1726867185.80509: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 13131 1726867185.80536: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 13131 1726867185.80552: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.80583: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.80713: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 13131 1726867185.80727: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8af1820> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8acf0b0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 13131 1726867185.80751: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.80798: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 13131 1726867185.80819: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.81008: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.81102: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 13131 1726867185.81127: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.81213: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.81310: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 13131 1726867185.81351: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.81394: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 13131 1726867185.81420: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13131 1726867185.81457: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.81588: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.81724: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 13131 1726867185.81782: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 13131 1726867185.81856: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.81992: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 13131 1726867185.82106: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13131 1726867185.82608: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.83106: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 13131 1726867185.83258: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 13131 1726867185.83261: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.83324: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 13131 1726867185.83349: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.83434: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.83527: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 13131 1726867185.83943: stdout chunk (state=3): >>># zipimport: 
zlib available # zipimport: zlib available <<< 13131 1726867185.84018: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 13131 1726867185.84045: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 13131 1726867185.84057: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.84121: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.84170: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 13131 1726867185.84194: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.84336: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.84487: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.84801: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.85119: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 13131 1726867185.85141: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.85227: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.85247: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 13131 1726867185.85318: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # <<< 13131 1726867185.85331: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.85442: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.85556: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available <<< 13131 1726867185.85602: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 13131 1726867185.85605: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 13131 1726867185.85687: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.85880: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 13131 1726867185.85944: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 13131 1726867185.85948: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.86483: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.86988: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 13131 1726867185.87001: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.87004: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 13131 1726867185.87113: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.87116: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 13131 1726867185.87118: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.87401: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.87409: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 13131 1726867185.87411: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.87414: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 13131 1726867185.87532: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 13131 1726867185.87535: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.87594: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.87675: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 13131 1726867185.87694: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.87968: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 13131 1726867185.88193: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.88507: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 13131 1726867185.88525: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.88606: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.88664: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 13131 1726867185.88692: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.88840: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 13131 1726867185.88965: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.89089: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 13131 1726867185.89112: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 13131 1726867185.89133: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.89283: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.89424: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' 
# <<< 13131 1726867185.89427: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # <<< 13131 1726867185.89447: stdout chunk (state=3): >>>import 'ansible.module_utils.facts' # <<< 13131 1726867185.89649: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867185.90951: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 13131 1726867185.91016: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867185.91019: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8889f10> <<< 13131 1726867185.91042: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8889310> <<< 13131 1726867185.91131: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8883b30> <<< 13131 1726867186.08430: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 13131 1726867186.08448: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 13131 1726867186.08506: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd88d0ce0> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 13131 1726867186.08529: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 13131 1726867186.08617: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd88d1a00> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 13131 1726867186.08621: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 13131 1726867186.08689: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 13131 1726867186.08694: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd891fd40> <<< 13131 1726867186.08721: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd88d3e00> <<< 13131 1726867186.09093: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 13131 1726867186.29292: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": 
"10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Za7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", 
"ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "<<< 13131 1726867186.29385: stdout chunk (state=3): >>>hour": "17", "minute": "19", "second": "45", 
"epoch": "1726867185", "epoch_int": "1726867185", "date": "2024-09-20", "time": "17:19:45", "iso8601_micro": "2024-09-20T21:19:45.917403Z", "iso8601": "2024-09-20T21:19:45Z", "iso8601_basic": "20240920T171945917403", "iso8601_basic_short": "20240920T171945", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.54150390625, "5m": 0.302734375, "15m": 0.1416015625}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:feff:fed3:7d4f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", 
"tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, 
"type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off 
[fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.57"], "ansible_all_ipv6_addresses": ["fe80::8ff:feff:fed3:7d4f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.57", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:feff:fed3:7d4f"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2959, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 572, "free": 2959}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", 
"ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_uuid": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 431, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796614144, "block_size": 4096, "block_total": 65519099, "block_available": 63915189, "block_used": 1603910, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, 
"filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13131 1726867186.29984: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] 
removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings <<< 13131 1726867186.30123: stdout chunk (state=3): >>># cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy 
ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing 
ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic <<< 13131 1726867186.30288: stdout chunk (state=3): >>># destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ <<< 13131 1726867186.30294: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing 
ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # 
destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr <<< 13131 1726867186.30309: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy 
ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 13131 1726867186.30631: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 13131 1726867186.30732: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # 
destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 13131 1726867186.30782: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 13131 1726867186.30796: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 13131 1726867186.30894: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 13131 1726867186.30926: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 13131 1726867186.30963: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl <<< 13131 1726867186.31144: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct <<< 13131 1726867186.31299: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy 
multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 13131 1726867186.31303: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] 
wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 13131 1726867186.31369: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 13131 1726867186.31480: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 13131 1726867186.31508: stdout chunk (state=3): >>># destroy _collections <<< 13131 1726867186.31533: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 13131 1726867186.31561: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 13131 1726867186.31719: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 13131 1726867186.31740: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # 
clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 13131 1726867186.31758: stdout chunk (state=3): >>># destroy time <<< 13131 1726867186.31785: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib <<< 13131 1726867186.31832: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools <<< 13131 1726867186.31844: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 13131 1726867186.32208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867186.32401: stderr chunk (state=3): >>><<< 13131 1726867186.32404: stdout chunk (state=3): >>><<< 13131 1726867186.32749: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9a184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd99e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9a1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd982d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd982dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd986be00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd986bec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98a37d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98a3e60> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9883ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98811f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9868fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98c3770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98c2390> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9882090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98c0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98f8800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9868230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd98f8cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98f8b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd98f8ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9866d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98f9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98f9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98fa480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd99106b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd9911d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9912c30> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd9913290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9912180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd9913d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9913440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98fa4e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd9607bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd96306e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9630440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd9630620> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd9630fe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd9631970> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9630890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9605d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9632cf0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9630e60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd98fabd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd965f020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd96833e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd96e01a0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd96e28d0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd96e02c0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd96ad190> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd94f11f0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd96821e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9633bf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6fd96828a0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_qfyhkfq9/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9552f00> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9531df0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9530f50> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9550dd0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd958a780> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd958a510> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9589e20> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd958a840> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd9553b90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd958b4a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd958b650> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd958bb90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f2d970> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8f2f590> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f2ff20> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f35100> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f37bc0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd9532ff0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f35e80> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f3bb90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f3a660> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f3a3c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f3a930> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f36390> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8f7fe90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f7f890> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8f81a60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f81820> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8f83fe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f82150> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f87680> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f84140> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8f88a10> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8f888c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8f880b0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f801d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8e14170> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8e153a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f8a8d0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8f8bc50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f8a4e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8e194c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8e1a1e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8e155b0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8e1a180> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8e1b2f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8e25d60> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8e215b0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f0e780> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8ffa480> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8e25dc0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8f88e60> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8eb9e50> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8a3bdd0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8a403e0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8ea2c30> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8eba9c0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8eb8530> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8eb8f20> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8a430b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8a42990> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8a42b40> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8a41e20> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8a431a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8aadca0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8a43c80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8eb81d0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8aad9d0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8aae990> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8addfa0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8acdd30> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8af1820> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8acf0b0> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6fd8889f10> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8889310> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd8883b30> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd88d0ce0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd88d1a00> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd891fd40> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6fd88d3e00> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Za7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", 
"has_sslcontext": true, "type": "cpython"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "19", "second": "45", "epoch": "1726867185", "epoch_int": "1726867185", "date": "2024-09-20", "time": "17:19:45", "iso8601_micro": "2024-09-20T21:19:45.917403Z", "iso8601": "2024-09-20T21:19:45Z", "iso8601_basic": "20240920T171945917403", "iso8601_basic_short": "20240920T171945", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.54150390625, "5m": 0.302734375, "15m": 0.1416015625}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:feff:fed3:7d4f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": 
"off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", 
"tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.57"], "ansible_all_ipv6_addresses": ["fe80::8ff:feff:fed3:7d4f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.57", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:feff:fed3:7d4f"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2959, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 572, "free": 2959}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_uuid": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", 
"ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 431, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796614144, "block_size": 4096, "block_total": 65519099, "block_available": 63915189, "block_used": 1603910, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform 
# cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing 
ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # 
cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy 
ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy 
logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # 
cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy 
contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. [WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy 
ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy 
_compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping 
selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # 
cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python 
interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 13131 1726867186.34524: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867184.9720309-13187-279519658609764/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867186.34527: _low_level_execute_command(): starting 13131 1726867186.34529: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867184.9720309-13187-279519658609764/ > /dev/null 2>&1 && sleep 0' 13131 1726867186.35153: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867186.35163: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867186.35173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867186.35257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867186.35261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867186.35263: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867186.35266: stderr chunk (state=3): >>>debug1: Reading configuration data
/etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867186.35268: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867186.35270: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867186.35272: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867186.35275: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867186.35289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867186.35365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867186.35373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867186.35448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867186.37596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867186.37600: stdout chunk (state=3): >>><<< 13131 1726867186.37635: stderr chunk (state=3): >>><<< 13131 1726867186.37639: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867186.37641: handler run complete 13131 1726867186.37849: variable 'ansible_facts' from source: unknown 13131 1726867186.38081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867186.39483: variable 'ansible_facts' from source: unknown 13131 1726867186.39512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867186.39642: attempt loop complete, returning result 13131 1726867186.39645: _execute() done 13131 1726867186.39648: dumping result to json 13131 1726867186.39698: done dumping result, returning 13131 1726867186.39706: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affcac9-a3a5-5f24-9b7a-0000000001bc] 13131 1726867186.39710: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001bc 13131 1726867186.40331: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001bc 13131 1726867186.40334: WORKER PROCESS EXITING ok: [managed_node1] 13131 1726867186.40960: no more pending results, returning what we have 13131 1726867186.40964: results queue empty 13131 1726867186.40965: checking for any_errors_fatal 13131 1726867186.40966: done checking for any_errors_fatal 13131 1726867186.40967: checking for max_fail_percentage 
13131 1726867186.40968: done checking for max_fail_percentage 13131 1726867186.40969: checking to see if all hosts have failed and the running result is not ok 13131 1726867186.40970: done checking to see if all hosts have failed 13131 1726867186.40971: getting the remaining hosts for this loop 13131 1726867186.40972: done getting the remaining hosts for this loop 13131 1726867186.40976: getting the next task for host managed_node1 13131 1726867186.40984: done getting next task for host managed_node1 13131 1726867186.40986: ^ task is: TASK: meta (flush_handlers) 13131 1726867186.40988: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867186.40992: getting variables 13131 1726867186.40993: in VariableManager get_vars() 13131 1726867186.41016: Calling all_inventory to load vars for managed_node1 13131 1726867186.41018: Calling groups_inventory to load vars for managed_node1 13131 1726867186.41021: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867186.41030: Calling all_plugins_play to load vars for managed_node1 13131 1726867186.41032: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867186.41035: Calling groups_plugins_play to load vars for managed_node1 13131 1726867186.41620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867186.41975: done with get_vars() 13131 1726867186.41992: done getting variables 13131 1726867186.42182: in VariableManager get_vars() 13131 1726867186.42306: Calling all_inventory to load vars for managed_node1 13131 1726867186.42309: Calling groups_inventory to load vars for managed_node1 13131 1726867186.42311: Calling all_plugins_inventory to load vars for managed_node1 
13131 1726867186.42316: Calling all_plugins_play to load vars for managed_node1 13131 1726867186.42329: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867186.42334: Calling groups_plugins_play to load vars for managed_node1 13131 1726867186.42558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867186.42945: done with get_vars() 13131 1726867186.42958: done queuing things up, now waiting for results queue to drain 13131 1726867186.42960: results queue empty 13131 1726867186.42960: checking for any_errors_fatal 13131 1726867186.42962: done checking for any_errors_fatal 13131 1726867186.42963: checking for max_fail_percentage 13131 1726867186.42964: done checking for max_fail_percentage 13131 1726867186.42965: checking to see if all hosts have failed and the running result is not ok 13131 1726867186.42965: done checking to see if all hosts have failed 13131 1726867186.42966: getting the remaining hosts for this loop 13131 1726867186.42967: done getting the remaining hosts for this loop 13131 1726867186.42969: getting the next task for host managed_node1 13131 1726867186.42973: done getting next task for host managed_node1 13131 1726867186.42975: ^ task is: TASK: Include the task 'el_repo_setup.yml' 13131 1726867186.42979: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867186.42981: getting variables 13131 1726867186.42982: in VariableManager get_vars() 13131 1726867186.42988: Calling all_inventory to load vars for managed_node1 13131 1726867186.42990: Calling groups_inventory to load vars for managed_node1 13131 1726867186.42992: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867186.43003: Calling all_plugins_play to load vars for managed_node1 13131 1726867186.43006: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867186.43009: Calling groups_plugins_play to load vars for managed_node1 13131 1726867186.43146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867186.43339: done with get_vars() 13131 1726867186.43346: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:11 Friday 20 September 2024 17:19:46 -0400 (0:00:01.521) 0:00:01.544 ****** 13131 1726867186.43414: entering _queue_task() for managed_node1/include_tasks 13131 1726867186.43416: Creating lock for include_tasks 13131 1726867186.43835: worker is 1 (out of 1 available) 13131 1726867186.43846: exiting _queue_task() for managed_node1/include_tasks 13131 1726867186.43856: done queuing things up, now waiting for results queue to drain 13131 1726867186.43858: waiting for pending results... 
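[Editor's note] The TASK banner above ("Include the task 'el_repo_setup.yml'", with a lock created for `include_tasks`) corresponds to a dynamic include in the test playbook at tests_bond_removal_nm.yml:11. A minimal sketch of what that task likely looks like, reconstructed from the task name and the included file path that appears later in this log; the exact YAML and the relative path are assumptions, not read from the playbook itself:

```yaml
# Hypothetical reconstruction of the include task at
# tests_bond_removal_nm.yml:11 -- the real file may differ.
- name: Include the task 'el_repo_setup.yml'
  ansible.builtin.include_tasks: tasks/el_repo_setup.yml
```

Because `include_tasks` is resolved at runtime rather than at parse time, the log shows the worker queuing the include first and only afterwards generating and filtering the blocks loaded from the included file.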
13131 1726867186.44896: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 13131 1726867186.45186: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000006 13131 1726867186.45189: variable 'ansible_search_path' from source: unknown 13131 1726867186.45192: calling self._execute() 13131 1726867186.45223: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867186.45300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867186.45416: variable 'omit' from source: magic vars 13131 1726867186.45611: _execute() done 13131 1726867186.45724: dumping result to json 13131 1726867186.45789: done dumping result, returning 13131 1726867186.45793: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0affcac9-a3a5-5f24-9b7a-000000000006] 13131 1726867186.45796: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000006 13131 1726867186.46317: no more pending results, returning what we have 13131 1726867186.46321: in VariableManager get_vars() 13131 1726867186.46348: Calling all_inventory to load vars for managed_node1 13131 1726867186.46351: Calling groups_inventory to load vars for managed_node1 13131 1726867186.46355: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867186.46366: Calling all_plugins_play to load vars for managed_node1 13131 1726867186.46369: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867186.46372: Calling groups_plugins_play to load vars for managed_node1 13131 1726867186.46771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867186.47111: done with get_vars() 13131 1726867186.47118: variable 'ansible_search_path' from source: unknown 13131 1726867186.47246: we have included files to process 13131 1726867186.47247: generating all_blocks data 13131 1726867186.47249: done generating all_blocks data 13131 
1726867186.47249: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 13131 1726867186.47251: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 13131 1726867186.47256: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000006 13131 1726867186.47259: WORKER PROCESS EXITING 13131 1726867186.47262: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 13131 1726867186.48634: in VariableManager get_vars() 13131 1726867186.48655: done with get_vars() 13131 1726867186.48666: done processing included file 13131 1726867186.48669: iterating over new_blocks loaded from include file 13131 1726867186.48670: in VariableManager get_vars() 13131 1726867186.48683: done with get_vars() 13131 1726867186.48685: filtering new block on tags 13131 1726867186.48702: done filtering new block on tags 13131 1726867186.48705: in VariableManager get_vars() 13131 1726867186.48715: done with get_vars() 13131 1726867186.48717: filtering new block on tags 13131 1726867186.48731: done filtering new block on tags 13131 1726867186.48734: in VariableManager get_vars() 13131 1726867186.48743: done with get_vars() 13131 1726867186.48744: filtering new block on tags 13131 1726867186.48762: done filtering new block on tags 13131 1726867186.48764: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 13131 1726867186.48770: extending task lists for all hosts with included blocks 13131 1726867186.48823: done extending task lists 13131 1726867186.48824: done processing included files 13131 1726867186.48825: results queue empty 13131 1726867186.48825: checking for any_errors_fatal 13131 1726867186.48826: done checking for any_errors_fatal 
13131 1726867186.48826: checking for max_fail_percentage 13131 1726867186.48827: done checking for max_fail_percentage 13131 1726867186.48828: checking to see if all hosts have failed and the running result is not ok 13131 1726867186.48830: done checking to see if all hosts have failed 13131 1726867186.48831: getting the remaining hosts for this loop 13131 1726867186.48832: done getting the remaining hosts for this loop 13131 1726867186.48834: getting the next task for host managed_node1 13131 1726867186.48839: done getting next task for host managed_node1 13131 1726867186.48841: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 13131 1726867186.48843: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867186.48846: getting variables 13131 1726867186.48847: in VariableManager get_vars() 13131 1726867186.48861: Calling all_inventory to load vars for managed_node1 13131 1726867186.48862: Calling groups_inventory to load vars for managed_node1 13131 1726867186.48864: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867186.48868: Calling all_plugins_play to load vars for managed_node1 13131 1726867186.48869: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867186.48870: Calling groups_plugins_play to load vars for managed_node1 13131 1726867186.48971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867186.49087: done with get_vars() 13131 1726867186.49096: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 17:19:46 -0400 (0:00:00.057) 0:00:01.601 ****** 13131 1726867186.49139: entering _queue_task() for managed_node1/setup 13131 1726867186.49339: worker is 1 (out of 1 available) 13131 1726867186.49351: exiting _queue_task() for managed_node1/setup 13131 1726867186.49361: done queuing things up, now waiting for results queue to drain 13131 1726867186.49362: waiting for pending results... 
13131 1726867186.49506: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 13131 1726867186.49561: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000001cd 13131 1726867186.49571: variable 'ansible_search_path' from source: unknown 13131 1726867186.49574: variable 'ansible_search_path' from source: unknown 13131 1726867186.49609: calling self._execute() 13131 1726867186.49664: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867186.49668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867186.49676: variable 'omit' from source: magic vars 13131 1726867186.50051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867186.52931: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867186.52984: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867186.53006: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867186.53031: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867186.53050: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867186.53112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867186.53132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867186.53149: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867186.53174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867186.53188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867186.53312: variable 'ansible_facts' from source: unknown 13131 1726867186.53351: variable 'network_test_required_facts' from source: task vars 13131 1726867186.53382: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): False 13131 1726867186.53386: when evaluation is False, skipping this task 13131 1726867186.53388: _execute() done 13131 1726867186.53394: dumping result to json 13131 1726867186.53397: done dumping result, returning 13131 1726867186.53402: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcac9-a3a5-5f24-9b7a-0000000001cd] 13131 1726867186.53407: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001cd 13131 1726867186.53495: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001cd 13131 1726867186.53499: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts", "skip_reason": "Conditional result was False" } 13131 1726867186.53589: no more pending results, returning what we have 13131 1726867186.53592: results queue empty 13131 1726867186.53593: checking for any_errors_fatal 13131 1726867186.53594: 
done checking for any_errors_fatal 13131 1726867186.53595: checking for max_fail_percentage 13131 1726867186.53597: done checking for max_fail_percentage 13131 1726867186.53597: checking to see if all hosts have failed and the running result is not ok 13131 1726867186.53598: done checking to see if all hosts have failed 13131 1726867186.53599: getting the remaining hosts for this loop 13131 1726867186.53600: done getting the remaining hosts for this loop 13131 1726867186.53603: getting the next task for host managed_node1 13131 1726867186.53610: done getting next task for host managed_node1 13131 1726867186.53612: ^ task is: TASK: Check if system is ostree 13131 1726867186.53615: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867186.53617: getting variables 13131 1726867186.53619: in VariableManager get_vars() 13131 1726867186.53644: Calling all_inventory to load vars for managed_node1 13131 1726867186.53646: Calling groups_inventory to load vars for managed_node1 13131 1726867186.53649: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867186.53657: Calling all_plugins_play to load vars for managed_node1 13131 1726867186.53659: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867186.53661: Calling groups_plugins_play to load vars for managed_node1 13131 1726867186.53773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867186.53910: done with get_vars() 13131 1726867186.53917: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 17:19:46 -0400 (0:00:00.048) 0:00:01.650 ****** 13131 1726867186.53973: entering _queue_task() for managed_node1/stat 13131 1726867186.54155: worker is 1 (out of 1 available) 13131 1726867186.54167: exiting _queue_task() for managed_node1/stat 13131 1726867186.54180: done queuing things up, now waiting for results queue to drain 13131 1726867186.54182: waiting for pending results... 
13131 1726867186.54321: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 13131 1726867186.54375: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000001cf 13131 1726867186.54426: variable 'ansible_search_path' from source: unknown 13131 1726867186.54429: variable 'ansible_search_path' from source: unknown 13131 1726867186.54445: calling self._execute() 13131 1726867186.54556: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867186.54559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867186.54562: variable 'omit' from source: magic vars 13131 1726867186.54984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867186.55207: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867186.55272: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867186.55310: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867186.55339: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867186.55406: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867186.55423: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867186.55448: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867186.55466: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867186.55565: Evaluated conditional (not __network_is_ostree is defined): True 13131 1726867186.55568: variable 'omit' from source: magic vars 13131 1726867186.55600: variable 'omit' from source: magic vars 13131 1726867186.55624: variable 'omit' from source: magic vars 13131 1726867186.55645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867186.55673: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867186.55683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867186.55699: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867186.55708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867186.55729: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867186.55732: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867186.55734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867186.55807: Set connection var ansible_connection to ssh 13131 1726867186.55814: Set connection var ansible_timeout to 10 13131 1726867186.55817: Set connection var ansible_shell_type to sh 13131 1726867186.55824: Set connection var ansible_shell_executable to /bin/sh 13131 1726867186.55831: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867186.55836: Set connection var ansible_pipelining to False 13131 1726867186.55851: variable 'ansible_shell_executable' from source: unknown 13131 1726867186.55854: variable 'ansible_connection' from 
source: unknown 13131 1726867186.55857: variable 'ansible_module_compression' from source: unknown 13131 1726867186.55859: variable 'ansible_shell_type' from source: unknown 13131 1726867186.55861: variable 'ansible_shell_executable' from source: unknown 13131 1726867186.55863: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867186.55868: variable 'ansible_pipelining' from source: unknown 13131 1726867186.55870: variable 'ansible_timeout' from source: unknown 13131 1726867186.55874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867186.55975: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867186.55984: variable 'omit' from source: magic vars 13131 1726867186.55989: starting attempt loop 13131 1726867186.55991: running the handler 13131 1726867186.56009: _low_level_execute_command(): starting 13131 1726867186.56016: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867186.56489: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867186.56495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867186.56498: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13131 1726867186.56500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867186.56556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867186.56559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867186.56563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867186.56605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13131 1726867186.58836: stdout chunk (state=3): >>>/root <<< 13131 1726867186.58981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867186.59008: stderr chunk (state=3): >>><<< 13131 1726867186.59012: stdout chunk (state=3): >>><<< 13131 1726867186.59037: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13131 1726867186.59047: _low_level_execute_command(): starting 13131 1726867186.59050: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867186.590323-13282-162022357317731 `" && echo ansible-tmp-1726867186.590323-13282-162022357317731="` echo /root/.ansible/tmp/ansible-tmp-1726867186.590323-13282-162022357317731 `" ) && sleep 0' 13131 1726867186.59469: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867186.59472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867186.59475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13131 1726867186.59478: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867186.59481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867186.59528: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867186.59531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867186.59588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13131 1726867186.62210: stdout chunk (state=3): >>>ansible-tmp-1726867186.590323-13282-162022357317731=/root/.ansible/tmp/ansible-tmp-1726867186.590323-13282-162022357317731 <<< 13131 1726867186.62327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867186.62349: stderr chunk (state=3): >>><<< 13131 1726867186.62352: stdout chunk (state=3): >>><<< 13131 1726867186.62366: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867186.590323-13282-162022357317731=/root/.ansible/tmp/ansible-tmp-1726867186.590323-13282-162022357317731 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13131 1726867186.62410: variable 'ansible_module_compression' from source: unknown 13131 1726867186.62455: ANSIBALLZ: Using lock for stat 13131 1726867186.62458: ANSIBALLZ: Acquiring lock 13131 1726867186.62461: ANSIBALLZ: Lock acquired: 140192902125936 13131 1726867186.62463: ANSIBALLZ: Creating module 13131 1726867186.70435: ANSIBALLZ: Writing module into payload 13131 1726867186.70497: ANSIBALLZ: Writing module 13131 1726867186.70513: ANSIBALLZ: Renaming module 13131 1726867186.70519: ANSIBALLZ: Done creating module 13131 1726867186.70540: variable 'ansible_facts' from source: unknown 13131 1726867186.70584: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867186.590323-13282-162022357317731/AnsiballZ_stat.py 13131 1726867186.70685: Sending initial data 13131 1726867186.70693: Sent initial data (152 bytes) 13131 1726867186.71124: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867186.71127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867186.71129: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867186.71131: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 13131 1726867186.71133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867186.71183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867186.71188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867186.71264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13131 1726867186.73518: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13131 1726867186.73523: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867186.73581: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867186.73637: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmphzg3u3v0 /root/.ansible/tmp/ansible-tmp-1726867186.590323-13282-162022357317731/AnsiballZ_stat.py <<< 13131 1726867186.73641: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867186.590323-13282-162022357317731/AnsiballZ_stat.py" <<< 13131 1726867186.73694: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmphzg3u3v0" to remote "/root/.ansible/tmp/ansible-tmp-1726867186.590323-13282-162022357317731/AnsiballZ_stat.py" <<< 13131 1726867186.73700: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867186.590323-13282-162022357317731/AnsiballZ_stat.py" <<< 13131 1726867186.74262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867186.74300: stderr chunk (state=3): >>><<< 13131 1726867186.74304: stdout chunk (state=3): >>><<< 13131 1726867186.74344: done transferring module to remote 13131 1726867186.74356: _low_level_execute_command(): starting 13131 1726867186.74359: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867186.590323-13282-162022357317731/ /root/.ansible/tmp/ansible-tmp-1726867186.590323-13282-162022357317731/AnsiballZ_stat.py && sleep 0' 13131 1726867186.74770: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867186.74774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867186.74776: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867186.74780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867186.74829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867186.74832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867186.74885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13131 1726867186.77153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867186.77173: stderr chunk (state=3): >>><<< 13131 1726867186.77176: stdout chunk (state=3): >>><<< 13131 1726867186.77190: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13131 1726867186.77199: _low_level_execute_command(): starting 13131 1726867186.77202: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867186.590323-13282-162022357317731/AnsiballZ_stat.py && sleep 0' 13131 1726867186.77601: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867186.77604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867186.77607: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 13131 1726867186.77609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867186.77678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867186.77743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13131 1726867186.80179: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 13131 1726867186.80313: stdout chunk (state=3): >>>import _imp # builtin <<< 13131 1726867186.80317: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 13131 1726867186.80505: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 13131 1726867186.80510: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8bbc4d0> <<< 13131 1726867186.80867: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8b8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8bbea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 13131 1726867186.80871: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<< 13131 1726867186.81095: 
stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8bcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8bcdfa0> <<< 13131 1726867186.81104: stdout chunk (state=3): >>>import 'site' # <<< 13131 1726867186.81129: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 13131 1726867186.81254: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 13131 1726867186.81284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 13131 1726867186.81335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 13131 1726867186.81366: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 13131 1726867186.81383: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 13131 1726867186.81658: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89abe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 13131 1726867186.81701: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89abf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches 
/usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89e3890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89e3f20> import '_collections' # <<< 13131 1726867186.81748: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89c3b60> <<< 13131 1726867186.81773: stdout chunk (state=3): >>>import '_functools' # <<< 13131 1726867186.81806: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89c1280> <<< 13131 1726867186.82158: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89a9040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 13131 1726867186.82164: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a03800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a02420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from 
'/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89c2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a00b60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a38860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89a82c0> <<< 13131 1726867186.82198: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 13131 1726867186.82222: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8a38d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a38bc0> <<< 13131 1726867186.82268: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8a38f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89a6de0> <<< 13131 1726867186.82303: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 13131 1726867186.82496: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a39610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a392e0> import 'importlib.machinery' # <<< 13131 1726867186.82597: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a3a510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a50710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8a51df0> <<< 13131 1726867186.82652: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches 
/usr/lib64/python3.12/_compression.py <<< 13131 1726867186.82666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a52c90> <<< 13131 1726867186.82707: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8a532f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a521e0> <<< 13131 1726867186.82750: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 13131 1726867186.82953: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8a53d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a534a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a3a540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 13131 1726867186.82996: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from 
'/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c87e3c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 13131 1726867186.83088: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c880c710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c880c470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c880c590> <<< 13131 1726867186.83098: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 13131 1726867186.83210: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867186.83316: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c880d010> <<< 13131 1726867186.83497: stdout chunk (state=3): >>># 
extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c880d9d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c880c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c87e1df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 13131 1726867186.83500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 13131 1726867186.83574: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c880ede0> <<< 13131 1726867186.83623: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c880db20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a3ac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 13131 1726867186.83702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 13131 1726867186.83705: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 13131 1726867186.83916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c883b140> <<< 13131 1726867186.83932: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c885b4d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 13131 1726867186.83994: stdout chunk (state=3): >>>import 'ntpath' # <<< 13131 1726867186.84028: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c88bc200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 13131 1726867186.84057: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 13131 1726867186.84080: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 13131 1726867186.84134: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 13131 1726867186.84206: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c88be960> <<< 13131 1726867186.84293: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c88bc320> <<< 13131 
1726867186.84315: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8881250> <<< 13131 1726867186.84518: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c81252e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c885a2d0> <<< 13131 1726867186.84564: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c880fd40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff7c885a630> <<< 13131 1726867186.84676: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_2_jijmxv/ansible_stat_payload.zip' # zipimport: zlib available <<< 13131 1726867186.84956: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 13131 1726867186.85003: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 13131 1726867186.85071: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 13131 1726867186.85119: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c817af90> import 
'_typing' # <<< 13131 1726867186.85395: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8159e80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8158fe0> <<< 13131 1726867186.85420: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.85451: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 13131 1726867186.85487: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13131 1726867186.85494: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 13131 1726867186.85581: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.87626: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.89264: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8178ce0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 13131 1726867186.89325: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 13131 1726867186.89330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 13131 1726867186.89414: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' 
<<< 13131 1726867186.89420: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c81aa960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c81aa6f0> <<< 13131 1726867186.89457: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c81aa000> <<< 13131 1726867186.89488: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 13131 1726867186.89494: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 13131 1726867186.89545: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c81aa540> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c817ba10> <<< 13131 1726867186.89562: stdout chunk (state=3): >>>import 'atexit' # <<< 13131 1726867186.89632: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c81ab650> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867186.89638: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c81ab890> <<< 13131 1726867186.89654: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches 
/usr/lib64/python3.12/locale.py <<< 13131 1726867186.89723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 13131 1726867186.89799: stdout chunk (state=3): >>>import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c81abda0> <<< 13131 1726867186.89814: stdout chunk (state=3): >>>import 'pwd' # <<< 13131 1726867186.89835: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 13131 1726867186.89868: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 13131 1726867186.89918: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c800db80> <<< 13131 1726867186.89950: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c800f7a0> <<< 13131 1726867186.89998: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 13131 1726867186.90046: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8010170> <<< 13131 1726867186.90097: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 13131 1726867186.90102: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 13131 1726867186.90153: stdout chunk (state=3): >>>import 'shlex' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8011310> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 13131 1726867186.90196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 13131 1726867186.90231: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 13131 1726867186.90350: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8013dd0> <<< 13131 1726867186.90353: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8013ef0> <<< 13131 1726867186.90381: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8012090> <<< 13131 1726867186.90403: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 13131 1726867186.90465: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 13131 1726867186.90524: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 13131 
1726867186.90558: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 13131 1726867186.90574: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c801be00> <<< 13131 1726867186.90654: stdout chunk (state=3): >>>import '_tokenize' # <<< 13131 1726867186.90709: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c801a8d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c801a630> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 13131 1726867186.90714: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 13131 1726867186.90841: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c801aba0> <<< 13131 1726867186.90903: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c80125a0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8063a40> <<< 13131 1726867186.90964: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8064140> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 13131 1726867186.91011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 13131 1726867186.91057: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867186.91062: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8065be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c80659a0> <<< 13131 1726867186.91138: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 13131 1726867186.91265: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 13131 1726867186.91333: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867186.91343: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8068170> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c80662d0> <<< 13131 1726867186.91365: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 
13131 1726867186.91447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 13131 1726867186.91471: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 13131 1726867186.91514: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c806b950> <<< 13131 1726867186.91641: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8068320> <<< 13131 1726867186.91703: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c806ca10> <<< 13131 1726867186.91729: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c806cb60> <<< 13131 1726867186.91805: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c806cb30> import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff7c80642f0> <<< 13131 1726867186.91836: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 13131 1726867186.91860: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 13131 1726867186.91897: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867186.91930: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c80f8350> <<< 13131 1726867186.92071: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867186.92087: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c80f9ac0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c806eae0> <<< 13131 1726867186.92128: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c806fe60> import 
'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c806e720> <<< 13131 1726867186.92164: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 13131 1726867186.92183: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.92287: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.92396: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.92420: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 13131 1726867186.92603: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.92657: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.93263: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.93741: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 13131 1726867186.93785: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 13131 1726867186.93806: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 13131 1726867186.93856: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 13131 1726867186.93880: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c80fdbe0> <<< 13131 
1726867186.93936: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 13131 1726867186.93963: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c80fe840> <<< 13131 1726867186.93988: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c806e450> <<< 13131 1726867186.94017: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 13131 1726867186.94047: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.94053: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.94074: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 13131 1726867186.94221: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.94398: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 13131 1726867186.94401: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c80fe540> # zipimport: zlib available <<< 13131 1726867186.94855: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.95383: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.95417: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.95440: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 13131 1726867186.95459: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.95517: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.95630: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 13131 1726867186.96000: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.96018: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 13131 1726867186.96035: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.96262: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 13131 1726867186.96401: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c80ffaa0> <<< 13131 1726867186.96426: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.96481: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.96565: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 13131 1726867186.96595: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 13131 1726867186.96638: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.96678: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 13131 1726867186.96705: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.96726: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.96781: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.96829: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.96904: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 13131 1726867186.96936: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 13131 1726867186.97020: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c7f0a630> <<< 13131 1726867186.97061: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c7f06ff0> <<< 13131 1726867186.97126: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 13131 1726867186.97143: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.97167: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.97227: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.97257: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.97319: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 13131 1726867186.97356: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 13131 1726867186.97394: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 13131 1726867186.97532: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 13131 1726867186.97559: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c81fec60> <<< 13131 1726867186.97652: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c81ee990> <<< 13131 1726867186.97666: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c806dca0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c80fd430> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 13131 1726867186.97692: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.97766: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 13131 1726867186.97797: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 13131 1726867186.97880: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 13131 1726867186.97953: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.98147: stdout chunk (state=3): >>># zipimport: zlib available <<< 13131 1726867186.98247: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, 
"get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 13131 1726867186.98297: stdout chunk (state=3): >>># destroy __main__ <<< 13131 1726867186.98662: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib <<< 13131 1726867186.98701: stdout chunk (state=3): >>># cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing 
collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # 
cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token <<< 13131 1726867186.98768: stdout chunk (state=3): >>># cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing 
ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast <<< 13131 1726867186.98876: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 13131 1726867186.99110: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 13131 1726867186.99115: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile <<< 13131 1726867186.99216: stdout chunk (state=3): >>># destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 13131 1726867186.99252: stdout chunk (state=3): >>># destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 13131 
1726867186.99309: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 13131 1726867186.99457: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 13131 1726867186.99469: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # 
cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 13131 1726867186.99714: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 13131 1726867186.99742: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 13131 1726867186.99844: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 13131 1726867186.99868: stdout chunk 
(state=3): >>># destroy _random # destroy _weakref <<< 13131 1726867186.99901: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _string # destroy re <<< 13131 1726867186.99942: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 13131 1726867187.00479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867187.00482: stdout chunk (state=3): >>><<< 13131 1726867187.00485: stderr chunk (state=3): >>><<< 13131 1726867187.00717: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8bbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8b8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8bbea50> import '_signal' # import '_abc' # import 
'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8bcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8bcdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89abe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89abf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89e3890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89e3f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89c3b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89c1280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89a9040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a03800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a02420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89c2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a00b60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a38860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89a82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8a38d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a38bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8a38f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c89a6de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a39610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a392e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a3a510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a50710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8a51df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a52c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8a532f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a521e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8a53d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a534a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a3a540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c87e3c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c880c710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c880c470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c880c590> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c880d010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c880d9d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c880c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c87e1df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c880ede0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c880db20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8a3ac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff7c883b140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c885b4d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c88bc200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c88be960> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c88bc320> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8881250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff7c81252e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c885a2d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c880fd40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff7c885a630> # zipimport: found 30 names in '/tmp/ansible_stat_payload_2_jijmxv/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c817af90> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8159e80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8158fe0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8178ce0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c81aa960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c81aa6f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c81aa000> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c81aa540> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c817ba10> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c81ab650> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c81ab890> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c81abda0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c800db80> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c800f7a0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8010170> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8011310> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8013dd0> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8013ef0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8012090> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c801be00> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c801a8d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c801a630> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c801aba0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c80125a0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8063a40> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8064140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8065be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c80659a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c8068170> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c80662d0> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c806b950> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c8068320> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c806ca10> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c806cb60> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c806cb30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c80642f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c80f8350> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c80f9ac0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c806eae0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c806fe60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c806e720> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c80fdbe0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c80fe840> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c806e450> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c80fe540> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c80ffaa0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff7c7f0a630> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c7f06ff0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c81fec60> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c81ee990> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c806dca0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff7c80fd430> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform 
# cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # 
cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ ... # clear sys.audit hooks 13131 1726867187.01912: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None,
'_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867186.590323-13282-162022357317731/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867187.01915: _low_level_execute_command(): starting 13131 1726867187.01918: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867186.590323-13282-162022357317731/ > /dev/null 2>&1 && sleep 0' 13131 1726867187.02166: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867187.02169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867187.02172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867187.02401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867187.02488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867187.04329: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 13131 1726867187.04360: stderr chunk (state=3): >>><<< 13131 1726867187.04363: stdout chunk (state=3): >>><<< 13131 1726867187.04641: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867187.04644: handler run complete 13131 1726867187.04647: attempt loop complete, returning result 13131 1726867187.04648: _execute() done 13131 1726867187.04650: dumping result to json 13131 1726867187.04652: done dumping result, returning 13131 1726867187.04654: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [0affcac9-a3a5-5f24-9b7a-0000000001cf] 13131 1726867187.04655: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001cf 13131 1726867187.04722: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001cf 13131 1726867187.04725: 
WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 13131 1726867187.04863: no more pending results, returning what we have 13131 1726867187.04866: results queue empty 13131 1726867187.04867: checking for any_errors_fatal 13131 1726867187.04873: done checking for any_errors_fatal 13131 1726867187.04874: checking for max_fail_percentage 13131 1726867187.04876: done checking for max_fail_percentage 13131 1726867187.04876: checking to see if all hosts have failed and the running result is not ok 13131 1726867187.04879: done checking to see if all hosts have failed 13131 1726867187.04879: getting the remaining hosts for this loop 13131 1726867187.04881: done getting the remaining hosts for this loop 13131 1726867187.04884: getting the next task for host managed_node1 13131 1726867187.04890: done getting next task for host managed_node1 13131 1726867187.04893: ^ task is: TASK: Set flag to indicate system is ostree 13131 1726867187.04896: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867187.04900: getting variables 13131 1726867187.04901: in VariableManager get_vars() 13131 1726867187.04932: Calling all_inventory to load vars for managed_node1 13131 1726867187.04935: Calling groups_inventory to load vars for managed_node1 13131 1726867187.04938: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867187.04950: Calling all_plugins_play to load vars for managed_node1 13131 1726867187.04953: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867187.04956: Calling groups_plugins_play to load vars for managed_node1 13131 1726867187.05525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867187.06055: done with get_vars() 13131 1726867187.06065: done getting variables 13131 1726867187.06269: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 17:19:47 -0400 (0:00:00.524) 0:00:02.174 ****** 13131 1726867187.06393: entering _queue_task() for managed_node1/set_fact 13131 1726867187.06395: Creating lock for set_fact 13131 1726867187.07017: worker is 1 (out of 1 available) 13131 1726867187.07028: exiting _queue_task() for managed_node1/set_fact 13131 1726867187.07040: done queuing things up, now waiting for results queue to drain 13131 1726867187.07041: waiting for pending results... 
13131 1726867187.07599: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 13131 1726867187.07607: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000001d0 13131 1726867187.07616: variable 'ansible_search_path' from source: unknown 13131 1726867187.07619: variable 'ansible_search_path' from source: unknown 13131 1726867187.07712: calling self._execute() 13131 1726867187.07927: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867187.07932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867187.07941: variable 'omit' from source: magic vars 13131 1726867187.09186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867187.09821: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867187.10027: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867187.10483: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867187.10486: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867187.10492: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867187.10495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867187.10498: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867187.11082: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867187.11085: Evaluated conditional (not __network_is_ostree is defined): True 13131 1726867187.11088: variable 'omit' from source: magic vars 13131 1726867187.11092: variable 'omit' from source: magic vars 13131 1726867187.11366: variable '__ostree_booted_stat' from source: set_fact 13131 1726867187.11421: variable 'omit' from source: magic vars 13131 1726867187.11450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867187.11882: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867187.11886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867187.11888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867187.11893: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867187.11895: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867187.11898: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867187.11900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867187.11902: Set connection var ansible_connection to ssh 13131 1726867187.11911: Set connection var ansible_timeout to 10 13131 1726867187.11917: Set connection var ansible_shell_type to sh 13131 1726867187.11929: Set connection var ansible_shell_executable to /bin/sh 13131 1726867187.12283: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867187.12286: Set connection var ansible_pipelining to False 13131 1726867187.12289: variable 'ansible_shell_executable' 
from source: unknown 13131 1726867187.12293: variable 'ansible_connection' from source: unknown 13131 1726867187.12295: variable 'ansible_module_compression' from source: unknown 13131 1726867187.12297: variable 'ansible_shell_type' from source: unknown 13131 1726867187.12298: variable 'ansible_shell_executable' from source: unknown 13131 1726867187.12300: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867187.12302: variable 'ansible_pipelining' from source: unknown 13131 1726867187.12304: variable 'ansible_timeout' from source: unknown 13131 1726867187.12305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867187.12308: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867187.12310: variable 'omit' from source: magic vars 13131 1726867187.12311: starting attempt loop 13131 1726867187.12313: running the handler 13131 1726867187.12315: handler run complete 13131 1726867187.12893: attempt loop complete, returning result 13131 1726867187.12896: _execute() done 13131 1726867187.12899: dumping result to json 13131 1726867187.12901: done dumping result, returning 13131 1726867187.12904: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0affcac9-a3a5-5f24-9b7a-0000000001d0] 13131 1726867187.12906: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001d0 13131 1726867187.12971: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001d0 13131 1726867187.12975: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 13131 1726867187.13027: no more pending results, returning what we have 13131 1726867187.13036: results 
queue empty 13131 1726867187.13037: checking for any_errors_fatal 13131 1726867187.13041: done checking for any_errors_fatal 13131 1726867187.13041: checking for max_fail_percentage 13131 1726867187.13043: done checking for max_fail_percentage 13131 1726867187.13043: checking to see if all hosts have failed and the running result is not ok 13131 1726867187.13044: done checking to see if all hosts have failed 13131 1726867187.13045: getting the remaining hosts for this loop 13131 1726867187.13046: done getting the remaining hosts for this loop 13131 1726867187.13049: getting the next task for host managed_node1 13131 1726867187.13055: done getting next task for host managed_node1 13131 1726867187.13058: ^ task is: TASK: Fix CentOS6 Base repo 13131 1726867187.13060: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867187.13063: getting variables 13131 1726867187.13064: in VariableManager get_vars() 13131 1726867187.13090: Calling all_inventory to load vars for managed_node1 13131 1726867187.13093: Calling groups_inventory to load vars for managed_node1 13131 1726867187.13096: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867187.13104: Calling all_plugins_play to load vars for managed_node1 13131 1726867187.13107: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867187.13114: Calling groups_plugins_play to load vars for managed_node1 13131 1726867187.13760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867187.13998: done with get_vars() 13131 1726867187.14011: done getting variables 13131 1726867187.14133: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 17:19:47 -0400 (0:00:00.077) 0:00:02.252 ****** 13131 1726867187.14160: entering _queue_task() for managed_node1/copy 13131 1726867187.14415: worker is 1 (out of 1 available) 13131 1726867187.14427: exiting _queue_task() for managed_node1/copy 13131 1726867187.14438: done queuing things up, now waiting for results queue to drain 13131 1726867187.14439: waiting for pending results... 
13131 1726867187.14701: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 13131 1726867187.14814: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000001d2 13131 1726867187.14833: variable 'ansible_search_path' from source: unknown 13131 1726867187.14841: variable 'ansible_search_path' from source: unknown 13131 1726867187.14880: calling self._execute() 13131 1726867187.14960: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867187.14971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867187.14988: variable 'omit' from source: magic vars 13131 1726867187.15645: variable 'ansible_distribution' from source: facts 13131 1726867187.15765: Evaluated conditional (ansible_distribution == 'CentOS'): True 13131 1726867187.16022: variable 'ansible_distribution_major_version' from source: facts 13131 1726867187.16025: Evaluated conditional (ansible_distribution_major_version == '6'): False 13131 1726867187.16130: when evaluation is False, skipping this task 13131 1726867187.16133: _execute() done 13131 1726867187.16136: dumping result to json 13131 1726867187.16138: done dumping result, returning 13131 1726867187.16140: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0affcac9-a3a5-5f24-9b7a-0000000001d2] 13131 1726867187.16142: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001d2 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 13131 1726867187.16364: no more pending results, returning what we have 13131 1726867187.16368: results queue empty 13131 1726867187.16369: checking for any_errors_fatal 13131 1726867187.16375: done checking for any_errors_fatal 13131 1726867187.16376: checking for max_fail_percentage 13131 1726867187.16380: done checking for max_fail_percentage 13131 1726867187.16381: checking to see if all hosts have failed and the 
running result is not ok 13131 1726867187.16381: done checking to see if all hosts have failed 13131 1726867187.16382: getting the remaining hosts for this loop 13131 1726867187.16383: done getting the remaining hosts for this loop 13131 1726867187.16387: getting the next task for host managed_node1 13131 1726867187.16395: done getting next task for host managed_node1 13131 1726867187.16397: ^ task is: TASK: Include the task 'enable_epel.yml' 13131 1726867187.16401: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867187.16406: getting variables 13131 1726867187.16407: in VariableManager get_vars() 13131 1726867187.16443: Calling all_inventory to load vars for managed_node1 13131 1726867187.16446: Calling groups_inventory to load vars for managed_node1 13131 1726867187.16450: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867187.16463: Calling all_plugins_play to load vars for managed_node1 13131 1726867187.16466: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867187.16469: Calling groups_plugins_play to load vars for managed_node1 13131 1726867187.17112: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001d2 13131 1726867187.17116: WORKER PROCESS EXITING 13131 1726867187.17249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867187.17523: done with get_vars() 13131 1726867187.17532: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 17:19:47 -0400 (0:00:00.035) 0:00:02.287 ****** 13131 1726867187.17675: entering _queue_task() for managed_node1/include_tasks 13131 1726867187.18258: worker is 1 (out of 1 available) 13131 1726867187.18268: exiting _queue_task() for managed_node1/include_tasks 13131 1726867187.18281: done queuing things up, now waiting for results queue to drain 13131 1726867187.18283: waiting for pending results... 
13131 1726867187.18443: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 13131 1726867187.18553: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000001d3 13131 1726867187.18576: variable 'ansible_search_path' from source: unknown 13131 1726867187.18586: variable 'ansible_search_path' from source: unknown 13131 1726867187.18631: calling self._execute() 13131 1726867187.18711: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867187.18730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867187.18787: variable 'omit' from source: magic vars 13131 1726867187.19356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867187.21938: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867187.22018: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867187.22069: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867187.22130: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867187.22184: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867187.22263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867187.22343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867187.22346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867187.22384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867187.22404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867187.22532: variable '__network_is_ostree' from source: set_fact 13131 1726867187.22560: Evaluated conditional (not __network_is_ostree | d(false)): True 13131 1726867187.22575: _execute() done 13131 1726867187.22665: dumping result to json 13131 1726867187.22668: done dumping result, returning 13131 1726867187.22670: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0affcac9-a3a5-5f24-9b7a-0000000001d3] 13131 1726867187.22672: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001d3 13131 1726867187.22743: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001d3 13131 1726867187.22745: WORKER PROCESS EXITING 13131 1726867187.22783: no more pending results, returning what we have 13131 1726867187.22788: in VariableManager get_vars() 13131 1726867187.22826: Calling all_inventory to load vars for managed_node1 13131 1726867187.22829: Calling groups_inventory to load vars for managed_node1 13131 1726867187.22833: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867187.22846: Calling all_plugins_play to load vars for managed_node1 13131 1726867187.22850: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867187.22853: Calling groups_plugins_play to load vars for managed_node1 13131 1726867187.23741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 13131 1726867187.24070: done with get_vars() 13131 1726867187.24080: variable 'ansible_search_path' from source: unknown 13131 1726867187.24081: variable 'ansible_search_path' from source: unknown 13131 1726867187.24118: we have included files to process 13131 1726867187.24119: generating all_blocks data 13131 1726867187.24120: done generating all_blocks data 13131 1726867187.24126: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13131 1726867187.24127: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13131 1726867187.24129: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13131 1726867187.25367: done processing included file 13131 1726867187.25370: iterating over new_blocks loaded from include file 13131 1726867187.25371: in VariableManager get_vars() 13131 1726867187.25393: done with get_vars() 13131 1726867187.25395: filtering new block on tags 13131 1726867187.25417: done filtering new block on tags 13131 1726867187.25419: in VariableManager get_vars() 13131 1726867187.25430: done with get_vars() 13131 1726867187.25432: filtering new block on tags 13131 1726867187.25443: done filtering new block on tags 13131 1726867187.25445: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 13131 1726867187.25450: extending task lists for all hosts with included blocks 13131 1726867187.25553: done extending task lists 13131 1726867187.25554: done processing included files 13131 1726867187.25555: results queue empty 13131 1726867187.25556: checking for any_errors_fatal 13131 1726867187.25558: done checking for any_errors_fatal 13131 1726867187.25559: checking for max_fail_percentage 13131 1726867187.25560: done 
checking for max_fail_percentage 13131 1726867187.25561: checking to see if all hosts have failed and the running result is not ok 13131 1726867187.25562: done checking to see if all hosts have failed 13131 1726867187.25562: getting the remaining hosts for this loop 13131 1726867187.25563: done getting the remaining hosts for this loop 13131 1726867187.25565: getting the next task for host managed_node1 13131 1726867187.25569: done getting next task for host managed_node1 13131 1726867187.25571: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 13131 1726867187.25573: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867187.25575: getting variables 13131 1726867187.25576: in VariableManager get_vars() 13131 1726867187.25587: Calling all_inventory to load vars for managed_node1 13131 1726867187.25589: Calling groups_inventory to load vars for managed_node1 13131 1726867187.25591: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867187.25602: Calling all_plugins_play to load vars for managed_node1 13131 1726867187.25612: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867187.25616: Calling groups_plugins_play to load vars for managed_node1 13131 1726867187.25781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867187.26048: done with get_vars() 13131 1726867187.26057: done getting variables 13131 1726867187.26121: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 13131 1726867187.26326: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 17:19:47 -0400 (0:00:00.087) 0:00:02.374 ****** 13131 1726867187.26397: entering _queue_task() for managed_node1/command 13131 1726867187.26399: Creating lock for command 13131 1726867187.26710: worker is 1 (out of 1 available) 13131 1726867187.26721: exiting _queue_task() for managed_node1/command 13131 1726867187.26733: done queuing things up, now waiting for results queue to drain 13131 1726867187.26735: waiting for pending results... 
13131 1726867187.27142: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 13131 1726867187.27146: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000001ed 13131 1726867187.27149: variable 'ansible_search_path' from source: unknown 13131 1726867187.27151: variable 'ansible_search_path' from source: unknown 13131 1726867187.27183: calling self._execute() 13131 1726867187.27283: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867187.27287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867187.27300: variable 'omit' from source: magic vars 13131 1726867187.27737: variable 'ansible_distribution' from source: facts 13131 1726867187.27740: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13131 1726867187.27856: variable 'ansible_distribution_major_version' from source: facts 13131 1726867187.27866: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13131 1726867187.27873: when evaluation is False, skipping this task 13131 1726867187.27881: _execute() done 13131 1726867187.27895: dumping result to json 13131 1726867187.27954: done dumping result, returning 13131 1726867187.27958: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [0affcac9-a3a5-5f24-9b7a-0000000001ed] 13131 1726867187.27960: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001ed 13131 1726867187.28327: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001ed 13131 1726867187.28331: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13131 1726867187.28416: no more pending results, returning what we have 13131 1726867187.28420: results queue empty 13131 1726867187.28421: checking for any_errors_fatal 13131 1726867187.28422: done checking for any_errors_fatal 13131 1726867187.28423: checking for 
max_fail_percentage 13131 1726867187.28425: done checking for max_fail_percentage 13131 1726867187.28425: checking to see if all hosts have failed and the running result is not ok 13131 1726867187.28426: done checking to see if all hosts have failed 13131 1726867187.28427: getting the remaining hosts for this loop 13131 1726867187.28428: done getting the remaining hosts for this loop 13131 1726867187.28484: getting the next task for host managed_node1 13131 1726867187.28497: done getting next task for host managed_node1 13131 1726867187.28500: ^ task is: TASK: Install yum-utils package 13131 1726867187.28504: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867187.28508: getting variables 13131 1726867187.28509: in VariableManager get_vars() 13131 1726867187.28583: Calling all_inventory to load vars for managed_node1 13131 1726867187.28587: Calling groups_inventory to load vars for managed_node1 13131 1726867187.28590: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867187.28606: Calling all_plugins_play to load vars for managed_node1 13131 1726867187.28610: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867187.28613: Calling groups_plugins_play to load vars for managed_node1 13131 1726867187.28873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867187.29110: done with get_vars() 13131 1726867187.29119: done getting variables 13131 1726867187.29215: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 17:19:47 -0400 (0:00:00.028) 0:00:02.403 ****** 13131 1726867187.29249: entering _queue_task() for managed_node1/package 13131 1726867187.29251: Creating lock for package 13131 1726867187.29609: worker is 1 (out of 1 available) 13131 1726867187.29618: exiting _queue_task() for managed_node1/package 13131 1726867187.29628: done queuing things up, now waiting for results queue to drain 13131 1726867187.29629: waiting for pending results... 
13131 1726867187.29882: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 13131 1726867187.29980: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000001ee 13131 1726867187.30014: variable 'ansible_search_path' from source: unknown 13131 1726867187.30017: variable 'ansible_search_path' from source: unknown 13131 1726867187.30048: calling self._execute() 13131 1726867187.30232: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867187.30235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867187.30238: variable 'omit' from source: magic vars 13131 1726867187.30519: variable 'ansible_distribution' from source: facts 13131 1726867187.30533: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13131 1726867187.30665: variable 'ansible_distribution_major_version' from source: facts 13131 1726867187.30672: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13131 1726867187.30683: when evaluation is False, skipping this task 13131 1726867187.30692: _execute() done 13131 1726867187.30828: dumping result to json 13131 1726867187.30831: done dumping result, returning 13131 1726867187.30834: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0affcac9-a3a5-5f24-9b7a-0000000001ee] 13131 1726867187.30836: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001ee skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13131 1726867187.30939: no more pending results, returning what we have 13131 1726867187.30942: results queue empty 13131 1726867187.30943: checking for any_errors_fatal 13131 1726867187.30948: done checking for any_errors_fatal 13131 1726867187.30949: checking for max_fail_percentage 13131 1726867187.30950: done checking for max_fail_percentage 13131 1726867187.30950: checking to see if 
all hosts have failed and the running result is not ok 13131 1726867187.30951: done checking to see if all hosts have failed 13131 1726867187.30952: getting the remaining hosts for this loop 13131 1726867187.30953: done getting the remaining hosts for this loop 13131 1726867187.30955: getting the next task for host managed_node1 13131 1726867187.30960: done getting next task for host managed_node1 13131 1726867187.30962: ^ task is: TASK: Enable EPEL 7 13131 1726867187.30965: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867187.30968: getting variables 13131 1726867187.30969: in VariableManager get_vars() 13131 1726867187.31034: Calling all_inventory to load vars for managed_node1 13131 1726867187.31037: Calling groups_inventory to load vars for managed_node1 13131 1726867187.31040: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867187.31050: Calling all_plugins_play to load vars for managed_node1 13131 1726867187.31052: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867187.31056: Calling groups_plugins_play to load vars for managed_node1 13131 1726867187.31222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867187.31422: done with get_vars() 13131 1726867187.31432: done getting variables 13131 1726867187.31484: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001ee 13131 1726867187.31487: WORKER PROCESS EXITING 13131 1726867187.31503: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 17:19:47 -0400 (0:00:00.022) 0:00:02.425 ****** 13131 1726867187.31529: entering _queue_task() for managed_node1/command 13131 1726867187.31781: worker is 1 (out of 1 available) 13131 1726867187.31792: exiting _queue_task() for managed_node1/command 13131 1726867187.31804: done queuing things up, now waiting for results queue to drain 13131 1726867187.31805: waiting for pending results... 
13131 1726867187.31981: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 13131 1726867187.32128: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000001ef 13131 1726867187.32133: variable 'ansible_search_path' from source: unknown 13131 1726867187.32136: variable 'ansible_search_path' from source: unknown 13131 1726867187.32157: calling self._execute() 13131 1726867187.32234: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867187.32247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867187.32262: variable 'omit' from source: magic vars 13131 1726867187.32668: variable 'ansible_distribution' from source: facts 13131 1726867187.32699: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13131 1726867187.32883: variable 'ansible_distribution_major_version' from source: facts 13131 1726867187.32886: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13131 1726867187.32889: when evaluation is False, skipping this task 13131 1726867187.32894: _execute() done 13131 1726867187.32896: dumping result to json 13131 1726867187.32898: done dumping result, returning 13131 1726867187.32901: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0affcac9-a3a5-5f24-9b7a-0000000001ef] 13131 1726867187.32902: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001ef skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13131 1726867187.33011: no more pending results, returning what we have 13131 1726867187.33014: results queue empty 13131 1726867187.33015: checking for any_errors_fatal 13131 1726867187.33020: done checking for any_errors_fatal 13131 1726867187.33021: checking for max_fail_percentage 13131 1726867187.33023: done checking for max_fail_percentage 13131 1726867187.33023: checking to see if all hosts have failed and 
the running result is not ok 13131 1726867187.33024: done checking to see if all hosts have failed 13131 1726867187.33024: getting the remaining hosts for this loop 13131 1726867187.33026: done getting the remaining hosts for this loop 13131 1726867187.33028: getting the next task for host managed_node1 13131 1726867187.33034: done getting next task for host managed_node1 13131 1726867187.33036: ^ task is: TASK: Enable EPEL 8 13131 1726867187.33040: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867187.33042: getting variables 13131 1726867187.33044: in VariableManager get_vars() 13131 1726867187.33068: Calling all_inventory to load vars for managed_node1 13131 1726867187.33070: Calling groups_inventory to load vars for managed_node1 13131 1726867187.33073: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867187.33084: Calling all_plugins_play to load vars for managed_node1 13131 1726867187.33086: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867187.33089: Calling groups_plugins_play to load vars for managed_node1 13131 1726867187.33285: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001ef 13131 1726867187.33289: WORKER PROCESS EXITING 13131 1726867187.33312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867187.33516: done with get_vars() 13131 1726867187.33524: done getting variables 13131 1726867187.33587: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 17:19:47 -0400 (0:00:00.020) 0:00:02.446 ****** 13131 1726867187.33614: entering _queue_task() for managed_node1/command 13131 1726867187.33842: worker is 1 (out of 1 available) 13131 1726867187.33854: exiting _queue_task() for managed_node1/command 13131 1726867187.33865: done queuing things up, now waiting for results queue to drain 13131 1726867187.33866: waiting for pending results... 
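The "Enable EPEL 7" skip above, and the "Enable EPEL 8" skip that follows, are produced by distribution conditionals evaluated against gathered facts. A task of roughly this shape would produce that trace; this is a hypothetical reconstruction, since tasks/enable_epel.yml itself is not reproduced in this log, and the module and command shown are assumptions:

```yaml
# Hypothetical reconstruction -- enable_epel.yml is not in this log.
# The two conditions are taken verbatim from the "Evaluated conditional"
# entries above; the command module and dnf invocation are assumptions.
- name: Enable EPEL 8
  command: dnf config-manager --set-enabled epel
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

Multiple `when` entries are ANDed. The log shows the first condition evaluating True and the second False (the managed node's major version is outside `['7', '8']`), so the task is skipped with `skip_reason: "Conditional result was False"` and `changed: false`.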
13131 1726867187.34132: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 13131 1726867187.34249: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000001f0 13131 1726867187.34266: variable 'ansible_search_path' from source: unknown 13131 1726867187.34273: variable 'ansible_search_path' from source: unknown 13131 1726867187.34317: calling self._execute() 13131 1726867187.34402: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867187.34415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867187.34431: variable 'omit' from source: magic vars 13131 1726867187.34712: variable 'ansible_distribution' from source: facts 13131 1726867187.34721: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13131 1726867187.34808: variable 'ansible_distribution_major_version' from source: facts 13131 1726867187.34812: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13131 1726867187.34814: when evaluation is False, skipping this task 13131 1726867187.34817: _execute() done 13131 1726867187.34821: dumping result to json 13131 1726867187.34823: done dumping result, returning 13131 1726867187.34830: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0affcac9-a3a5-5f24-9b7a-0000000001f0] 13131 1726867187.34834: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001f0 13131 1726867187.34916: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001f0 13131 1726867187.34919: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13131 1726867187.34961: no more pending results, returning what we have 13131 1726867187.34964: results queue empty 13131 1726867187.34965: checking for any_errors_fatal 13131 1726867187.34969: done checking for any_errors_fatal 13131 1726867187.34970: checking for 
max_fail_percentage 13131 1726867187.34972: done checking for max_fail_percentage 13131 1726867187.34972: checking to see if all hosts have failed and the running result is not ok 13131 1726867187.34973: done checking to see if all hosts have failed 13131 1726867187.34973: getting the remaining hosts for this loop 13131 1726867187.34974: done getting the remaining hosts for this loop 13131 1726867187.34979: getting the next task for host managed_node1 13131 1726867187.34985: done getting next task for host managed_node1 13131 1726867187.34987: ^ task is: TASK: Enable EPEL 6 13131 1726867187.34990: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867187.34992: getting variables 13131 1726867187.34994: in VariableManager get_vars() 13131 1726867187.35015: Calling all_inventory to load vars for managed_node1 13131 1726867187.35017: Calling groups_inventory to load vars for managed_node1 13131 1726867187.35020: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867187.35030: Calling all_plugins_play to load vars for managed_node1 13131 1726867187.35032: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867187.35035: Calling groups_plugins_play to load vars for managed_node1 13131 1726867187.35147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867187.35259: done with get_vars() 13131 1726867187.35265: done getting variables 13131 1726867187.35306: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 17:19:47 -0400 (0:00:00.017) 0:00:02.463 ****** 13131 1726867187.35324: entering _queue_task() for managed_node1/copy 13131 1726867187.35484: worker is 1 (out of 1 available) 13131 1726867187.35495: exiting _queue_task() for managed_node1/copy 13131 1726867187.35511: done queuing things up, now waiting for results queue to drain 13131 1726867187.35512: waiting for pending results... 
13131 1726867187.35651: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 13131 1726867187.35714: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000001f2 13131 1726867187.35723: variable 'ansible_search_path' from source: unknown 13131 1726867187.35727: variable 'ansible_search_path' from source: unknown 13131 1726867187.35754: calling self._execute() 13131 1726867187.35809: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867187.35813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867187.35820: variable 'omit' from source: magic vars 13131 1726867187.36111: variable 'ansible_distribution' from source: facts 13131 1726867187.36119: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13131 1726867187.36193: variable 'ansible_distribution_major_version' from source: facts 13131 1726867187.36202: Evaluated conditional (ansible_distribution_major_version == '6'): False 13131 1726867187.36205: when evaluation is False, skipping this task 13131 1726867187.36208: _execute() done 13131 1726867187.36211: dumping result to json 13131 1726867187.36213: done dumping result, returning 13131 1726867187.36218: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0affcac9-a3a5-5f24-9b7a-0000000001f2] 13131 1726867187.36223: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001f2 13131 1726867187.36305: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001f2 13131 1726867187.36309: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 13131 1726867187.36349: no more pending results, returning what we have 13131 1726867187.36352: results queue empty 13131 1726867187.36353: checking for any_errors_fatal 13131 1726867187.36356: done checking for any_errors_fatal 13131 1726867187.36356: checking for max_fail_percentage 
13131 1726867187.36358: done checking for max_fail_percentage 13131 1726867187.36358: checking to see if all hosts have failed and the running result is not ok 13131 1726867187.36359: done checking to see if all hosts have failed 13131 1726867187.36360: getting the remaining hosts for this loop 13131 1726867187.36360: done getting the remaining hosts for this loop 13131 1726867187.36363: getting the next task for host managed_node1 13131 1726867187.36370: done getting next task for host managed_node1 13131 1726867187.36372: ^ task is: TASK: Set network provider to 'nm' 13131 1726867187.36374: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867187.36379: getting variables 13131 1726867187.36380: in VariableManager get_vars() 13131 1726867187.36401: Calling all_inventory to load vars for managed_node1 13131 1726867187.36404: Calling groups_inventory to load vars for managed_node1 13131 1726867187.36406: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867187.36416: Calling all_plugins_play to load vars for managed_node1 13131 1726867187.36420: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867187.36422: Calling groups_plugins_play to load vars for managed_node1 13131 1726867187.36546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867187.36669: done with get_vars() 13131 1726867187.36675: done getting variables 13131 1726867187.36713: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:13 Friday 20 September 2024 17:19:47 -0400 (0:00:00.014) 0:00:02.477 ****** 13131 1726867187.36729: entering _queue_task() for managed_node1/set_fact 13131 1726867187.36886: worker is 1 (out of 1 available) 13131 1726867187.36897: exiting _queue_task() for managed_node1/set_fact 13131 1726867187.36908: done queuing things up, now waiting for results queue to drain 13131 1726867187.36909: waiting for pending results... 13131 1726867187.37239: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 13131 1726867187.37245: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000007 13131 1726867187.37247: variable 'ansible_search_path' from source: unknown 13131 1726867187.37250: calling self._execute() 13131 1726867187.37252: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867187.37255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867187.37257: variable 'omit' from source: magic vars 13131 1726867187.37320: variable 'omit' from source: magic vars 13131 1726867187.37346: variable 'omit' from source: magic vars 13131 1726867187.37370: variable 'omit' from source: magic vars 13131 1726867187.37403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867187.37430: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867187.37446: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867187.37463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867187.37472: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867187.37498: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867187.37502: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867187.37505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867187.37571: Set connection var ansible_connection to ssh 13131 1726867187.37579: Set connection var ansible_timeout to 10 13131 1726867187.37583: Set connection var ansible_shell_type to sh 13131 1726867187.37590: Set connection var ansible_shell_executable to /bin/sh 13131 1726867187.37599: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867187.37604: Set connection var ansible_pipelining to False 13131 1726867187.37619: variable 'ansible_shell_executable' from source: unknown 13131 1726867187.37624: variable 'ansible_connection' from source: unknown 13131 1726867187.37627: variable 'ansible_module_compression' from source: unknown 13131 1726867187.37629: variable 'ansible_shell_type' from source: unknown 13131 1726867187.37632: variable 'ansible_shell_executable' from source: unknown 13131 1726867187.37634: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867187.37639: variable 'ansible_pipelining' from source: unknown 13131 1726867187.37642: variable 'ansible_timeout' from source: unknown 13131 1726867187.37646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867187.37749: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867187.37757: variable 'omit' from source: magic vars 13131 1726867187.37762: starting 
attempt loop 13131 1726867187.37765: running the handler 13131 1726867187.37774: handler run complete 13131 1726867187.37788: attempt loop complete, returning result 13131 1726867187.37793: _execute() done 13131 1726867187.37795: dumping result to json 13131 1726867187.37798: done dumping result, returning 13131 1726867187.37800: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0affcac9-a3a5-5f24-9b7a-000000000007] 13131 1726867187.37803: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000007 13131 1726867187.37873: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000007 13131 1726867187.37876: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 13131 1726867187.37924: no more pending results, returning what we have 13131 1726867187.37926: results queue empty 13131 1726867187.37927: checking for any_errors_fatal 13131 1726867187.37931: done checking for any_errors_fatal 13131 1726867187.37932: checking for max_fail_percentage 13131 1726867187.37933: done checking for max_fail_percentage 13131 1726867187.37934: checking to see if all hosts have failed and the running result is not ok 13131 1726867187.37934: done checking to see if all hosts have failed 13131 1726867187.37935: getting the remaining hosts for this loop 13131 1726867187.37936: done getting the remaining hosts for this loop 13131 1726867187.37938: getting the next task for host managed_node1 13131 1726867187.37943: done getting next task for host managed_node1 13131 1726867187.37944: ^ task is: TASK: meta (flush_handlers) 13131 1726867187.37946: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867187.37950: getting variables 13131 1726867187.37951: in VariableManager get_vars() 13131 1726867187.37971: Calling all_inventory to load vars for managed_node1 13131 1726867187.37974: Calling groups_inventory to load vars for managed_node1 13131 1726867187.37976: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867187.37986: Calling all_plugins_play to load vars for managed_node1 13131 1726867187.37989: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867187.37994: Calling groups_plugins_play to load vars for managed_node1 13131 1726867187.38124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867187.38296: done with get_vars() 13131 1726867187.38303: done getting variables 13131 1726867187.38354: in VariableManager get_vars() 13131 1726867187.38361: Calling all_inventory to load vars for managed_node1 13131 1726867187.38363: Calling groups_inventory to load vars for managed_node1 13131 1726867187.38365: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867187.38368: Calling all_plugins_play to load vars for managed_node1 13131 1726867187.38370: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867187.38372: Calling groups_plugins_play to load vars for managed_node1 13131 1726867187.38539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867187.38699: done with get_vars() 13131 1726867187.38708: done queuing things up, now waiting for results queue to drain 13131 1726867187.38709: results queue empty 13131 1726867187.38710: checking for any_errors_fatal 13131 1726867187.38711: done checking for any_errors_fatal 13131 1726867187.38712: checking for max_fail_percentage 13131 1726867187.38713: done checking for max_fail_percentage 13131 1726867187.38713: checking to see if all hosts have failed and the running result is not 
ok 13131 1726867187.38714: done checking to see if all hosts have failed 13131 1726867187.38715: getting the remaining hosts for this loop 13131 1726867187.38716: done getting the remaining hosts for this loop 13131 1726867187.38718: getting the next task for host managed_node1 13131 1726867187.38720: done getting next task for host managed_node1 13131 1726867187.38721: ^ task is: TASK: meta (flush_handlers) 13131 1726867187.38722: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867187.38727: getting variables 13131 1726867187.38727: in VariableManager get_vars() 13131 1726867187.38732: Calling all_inventory to load vars for managed_node1 13131 1726867187.38734: Calling groups_inventory to load vars for managed_node1 13131 1726867187.38735: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867187.38738: Calling all_plugins_play to load vars for managed_node1 13131 1726867187.38739: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867187.38741: Calling groups_plugins_play to load vars for managed_node1 13131 1726867187.38818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867187.38923: done with get_vars() 13131 1726867187.38928: done getting variables 13131 1726867187.38956: in VariableManager get_vars() 13131 1726867187.38962: Calling all_inventory to load vars for managed_node1 13131 1726867187.38963: Calling groups_inventory to load vars for managed_node1 13131 1726867187.38965: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867187.38967: Calling all_plugins_play to load vars for managed_node1 13131 1726867187.38969: Calling groups_plugins_inventory to load vars for 
managed_node1 13131 1726867187.38970: Calling groups_plugins_play to load vars for managed_node1 13131 1726867187.39064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867187.39167: done with get_vars() 13131 1726867187.39175: done queuing things up, now waiting for results queue to drain 13131 1726867187.39176: results queue empty 13131 1726867187.39178: checking for any_errors_fatal 13131 1726867187.39179: done checking for any_errors_fatal 13131 1726867187.39180: checking for max_fail_percentage 13131 1726867187.39180: done checking for max_fail_percentage 13131 1726867187.39181: checking to see if all hosts have failed and the running result is not ok 13131 1726867187.39181: done checking to see if all hosts have failed 13131 1726867187.39181: getting the remaining hosts for this loop 13131 1726867187.39182: done getting the remaining hosts for this loop 13131 1726867187.39183: getting the next task for host managed_node1 13131 1726867187.39185: done getting next task for host managed_node1 13131 1726867187.39185: ^ task is: None 13131 1726867187.39186: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867187.39187: done queuing things up, now waiting for results queue to drain 13131 1726867187.39187: results queue empty 13131 1726867187.39188: checking for any_errors_fatal 13131 1726867187.39188: done checking for any_errors_fatal 13131 1726867187.39189: checking for max_fail_percentage 13131 1726867187.39189: done checking for max_fail_percentage 13131 1726867187.39190: checking to see if all hosts have failed and the running result is not ok 13131 1726867187.39192: done checking to see if all hosts have failed 13131 1726867187.39193: getting the next task for host managed_node1 13131 1726867187.39194: done getting next task for host managed_node1 13131 1726867187.39195: ^ task is: None 13131 1726867187.39196: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867187.39229: in VariableManager get_vars() 13131 1726867187.39253: done with get_vars() 13131 1726867187.39257: in VariableManager get_vars() 13131 1726867187.39271: done with get_vars() 13131 1726867187.39274: variable 'omit' from source: magic vars 13131 1726867187.39297: in VariableManager get_vars() 13131 1726867187.39311: done with get_vars() 13131 1726867187.39323: variable 'omit' from source: magic vars PLAY [Play for testing bond removal] ******************************************* 13131 1726867187.39967: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 13131 1726867187.39988: getting the remaining hosts for this loop 13131 1726867187.39989: done getting the remaining hosts for this loop 13131 1726867187.39993: getting the next task for host managed_node1 13131 1726867187.39994: done getting next task for host managed_node1 13131 1726867187.39995: ^ task is: TASK: Gathering Facts 13131 1726867187.39996: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867187.39998: getting variables 13131 1726867187.39998: in VariableManager get_vars() 13131 1726867187.40009: Calling all_inventory to load vars for managed_node1 13131 1726867187.40010: Calling groups_inventory to load vars for managed_node1 13131 1726867187.40011: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867187.40015: Calling all_plugins_play to load vars for managed_node1 13131 1726867187.40024: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867187.40027: Calling groups_plugins_play to load vars for managed_node1 13131 1726867187.40106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867187.40215: done with get_vars() 13131 1726867187.40220: done getting variables 13131 1726867187.40246: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:3 Friday 20 September 2024 17:19:47 -0400 (0:00:00.035) 0:00:02.513 ****** 13131 1726867187.40260: entering _queue_task() for managed_node1/gather_facts 13131 1726867187.40417: worker is 1 (out of 1 available) 13131 1726867187.40428: exiting _queue_task() for managed_node1/gather_facts 13131 1726867187.40439: done queuing things up, now waiting for results queue to drain 13131 1726867187.40440: waiting for pending results... 
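At this point the play "Play for testing bond removal" has started and its implicit fact-gathering runs over the persistent SSH connection (note the `auto-mux: Trying existing master` ControlMaster reuse in the stderr chunks that follow). The play opening implied by the log is roughly the following; a hypothetical sketch, since playbooks/tests_bond_removal.yml is not reproduced here, so the hosts pattern is an assumption:

```yaml
# Hypothetical sketch of the play opening implied by the log;
# tests_bond_removal.yml is not shown, so hosts is an assumption.
- name: Play for testing bond removal
  hosts: managed_node1
  gather_facts: true   # drives the "Gathering Facts" task in this log
```

Before the facts module itself is transferred, Ansible first resolves the remote home directory (`/bin/sh -c 'echo ~ && sleep 0'`) and then creates a per-task temporary directory under `~/.ansible/tmp`, which is exactly what the `_low_level_execute_command()` entries that follow record.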
13131 1726867187.40569: running TaskExecutor() for managed_node1/TASK: Gathering Facts 13131 1726867187.40627: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000218 13131 1726867187.40637: variable 'ansible_search_path' from source: unknown 13131 1726867187.40665: calling self._execute() 13131 1726867187.40727: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867187.40731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867187.40740: variable 'omit' from source: magic vars 13131 1726867187.40988: variable 'ansible_distribution_major_version' from source: facts 13131 1726867187.40997: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867187.41004: variable 'omit' from source: magic vars 13131 1726867187.41021: variable 'omit' from source: magic vars 13131 1726867187.41044: variable 'omit' from source: magic vars 13131 1726867187.41071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867187.41098: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867187.41114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867187.41129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867187.41139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867187.41159: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867187.41162: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867187.41165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867187.41233: Set connection var ansible_connection to ssh 13131 1726867187.41237: Set 
connection var ansible_timeout to 10 13131 1726867187.41240: Set connection var ansible_shell_type to sh 13131 1726867187.41246: Set connection var ansible_shell_executable to /bin/sh 13131 1726867187.41253: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867187.41258: Set connection var ansible_pipelining to False 13131 1726867187.41274: variable 'ansible_shell_executable' from source: unknown 13131 1726867187.41279: variable 'ansible_connection' from source: unknown 13131 1726867187.41282: variable 'ansible_module_compression' from source: unknown 13131 1726867187.41285: variable 'ansible_shell_type' from source: unknown 13131 1726867187.41287: variable 'ansible_shell_executable' from source: unknown 13131 1726867187.41289: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867187.41294: variable 'ansible_pipelining' from source: unknown 13131 1726867187.41297: variable 'ansible_timeout' from source: unknown 13131 1726867187.41299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867187.41420: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867187.41428: variable 'omit' from source: magic vars 13131 1726867187.41432: starting attempt loop 13131 1726867187.41435: running the handler 13131 1726867187.41553: variable 'ansible_facts' from source: unknown 13131 1726867187.41556: _low_level_execute_command(): starting 13131 1726867187.41559: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867187.41958: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 13131 1726867187.41962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867187.41964: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867187.41966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867187.42024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867187.42028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867187.42030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867187.42099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13131 1726867187.44415: stdout chunk (state=3): >>>/root <<< 13131 1726867187.44560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867187.44589: stderr chunk (state=3): >>><<< 13131 1726867187.44595: stdout chunk (state=3): >>><<< 13131 1726867187.44615: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 
originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13131 1726867187.44624: _low_level_execute_command(): starting 13131 1726867187.44630: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867187.4461274-13353-124549036106375 `" && echo ansible-tmp-1726867187.4461274-13353-124549036106375="` echo /root/.ansible/tmp/ansible-tmp-1726867187.4461274-13353-124549036106375 `" ) && sleep 0' 13131 1726867187.45057: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867187.45060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867187.45062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867187.45072: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867187.45074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867187.45126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867187.45132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867187.45188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13131 1726867187.47851: stdout chunk (state=3): >>>ansible-tmp-1726867187.4461274-13353-124549036106375=/root/.ansible/tmp/ansible-tmp-1726867187.4461274-13353-124549036106375 <<< 13131 1726867187.48022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867187.48025: stdout chunk (state=3): >>><<< 13131 1726867187.48031: stderr chunk (state=3): >>><<< 13131 1726867187.48044: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867187.4461274-13353-124549036106375=/root/.ansible/tmp/ansible-tmp-1726867187.4461274-13353-124549036106375 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13131 1726867187.48071: variable 'ansible_module_compression' from source: unknown 13131 1726867187.48110: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 13131 1726867187.48153: variable 'ansible_facts' from source: unknown 13131 1726867187.48346: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867187.4461274-13353-124549036106375/AnsiballZ_setup.py 13131 1726867187.48379: Sending initial data 13131 1726867187.48383: Sent initial data (154 bytes) 13131 1726867187.48810: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867187.48813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867187.48816: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867187.48820: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867187.48864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867187.48868: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867187.48925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13131 1726867187.51108: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13131 1726867187.51111: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867187.51155: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867187.51204: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpkm29tzkh /root/.ansible/tmp/ansible-tmp-1726867187.4461274-13353-124549036106375/AnsiballZ_setup.py <<< 13131 1726867187.51207: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867187.4461274-13353-124549036106375/AnsiballZ_setup.py" <<< 13131 1726867187.51254: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpkm29tzkh" to remote "/root/.ansible/tmp/ansible-tmp-1726867187.4461274-13353-124549036106375/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867187.4461274-13353-124549036106375/AnsiballZ_setup.py" <<< 13131 1726867187.52400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867187.52412: stderr chunk (state=3): >>><<< 13131 1726867187.52419: stdout chunk (state=3): >>><<< 13131 1726867187.52523: done transferring module to remote 13131 1726867187.52527: _low_level_execute_command(): starting 13131 1726867187.52529: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867187.4461274-13353-124549036106375/ /root/.ansible/tmp/ansible-tmp-1726867187.4461274-13353-124549036106375/AnsiballZ_setup.py && sleep 0' 13131 1726867187.53081: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867187.53130: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867187.53133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867187.53136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867187.53138: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867187.53142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867187.53179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867187.53203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867187.53250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13131 1726867187.55784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867187.55810: stderr chunk (state=3): >>><<< 13131 1726867187.55815: stdout chunk (state=3): >>><<< 13131 1726867187.55829: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13131 1726867187.55832: _low_level_execute_command(): starting 13131 1726867187.55837: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867187.4461274-13353-124549036106375/AnsiballZ_setup.py && sleep 0' 13131 1726867187.56241: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867187.56244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867187.56247: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 13131 1726867187.56249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867187.56251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867187.56302: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867187.56308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867187.56361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13131 1726867188.39049: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fibre_channel_wwn": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Za7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.54150390625, "5m": 0.302734375, "15m": 0.1416015625}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "19", "second": "48", "epoch": "1726867188", "epoch_int": "1726867188", "date": "2024-09-20", "time": "17:19:48", "iso8601_micro": "2024-09-20T21:19:48.003037Z", "iso8601": "2024-09-20T21:19:48Z", "iso8601_basic": "20240920T171948003037", "iso8601_basic_short": "20240920T171948", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, 
"version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2,<<< 13131 1726867188.39105: stdout chunk (state=3): >>> "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2945, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 586, "free": 2945}, "nocache": {"free": 3282, "used": 249}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_uuid": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": 
[]}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 433, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796585472, "block_size": 4096, "block_total": 65519099, "block_available": 63915182, "block_used": 1603917, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": 
"1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:feff:fed3:7d4f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off 
[fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": 
"on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "type": "ether", "alias": "eth0"}, 
"ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.57"], "ansible_all_ipv6_addresses": ["fe80::8ff:feff:fed3:7d4f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.57", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:feff:fed3:7d4f"]}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13131 1726867188.41981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867188.42015: stderr chunk (state=3): >>><<< 13131 1726867188.42017: stdout chunk (state=3): >>><<< 13131 1726867188.42044: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", 
"ansible_fibre_channel_wwn": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Za7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.54150390625, "5m": 0.302734375, "15m": 0.1416015625}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": 
"2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "19", "second": "48", "epoch": "1726867188", "epoch_int": "1726867188", "date": "2024-09-20", "time": "17:19:48", "iso8601_micro": "2024-09-20T21:19:48.003037Z", "iso8601": "2024-09-20T21:19:48Z", "iso8601_basic": "20240920T171948003037", "iso8601_basic_short": "20240920T171948", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2945, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 586, "free": 2945}, "nocache": {"free": 3282, "used": 249}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_uuid": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_version": "4.11.amazon", 
"ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 433, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796585472, "block_size": 4096, "block_total": 65519099, "block_available": 63915182, "block_used": 1603917, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:feff:fed3:7d4f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on 
[fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.57"], "ansible_all_ipv6_addresses": ["fe80::8ff:feff:fed3:7d4f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.57", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:feff:fed3:7d4f"]}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867188.42362: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867187.4461274-13353-124549036106375/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867188.42366: _low_level_execute_command(): starting 13131 1726867188.42368: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867187.4461274-13353-124549036106375/ > /dev/null 2>&1 && sleep 0' 13131 1726867188.42930: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867188.42942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867188.42955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867188.42967: stderr chunk (state=3): >>>debug2: match found <<< 13131 1726867188.43039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867188.43057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867188.43082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867188.43096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867188.43174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13131 1726867188.45863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867188.45867: stdout chunk (state=3): >>><<< 13131 1726867188.45973: stderr chunk (state=3): >>><<< 13131 1726867188.45979: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13131 1726867188.45983: handler run complete 13131 1726867188.46050: variable 'ansible_facts' from source: unknown 13131 1726867188.46161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867188.46507: variable 'ansible_facts' from source: unknown 13131 1726867188.46606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867188.46765: attempt loop complete, returning result 13131 1726867188.46846: _execute() done 13131 1726867188.46851: dumping result to json 13131 1726867188.46853: done dumping result, returning 13131 1726867188.46855: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affcac9-a3a5-5f24-9b7a-000000000218] 13131 1726867188.46857: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000218 ok: [managed_node1] 13131 1726867188.47798: no more pending results, returning what we have 13131 1726867188.47801: results queue empty 13131 1726867188.47802: checking for any_errors_fatal 13131 1726867188.47803: done checking for any_errors_fatal 13131 1726867188.47803: checking for max_fail_percentage 13131 1726867188.47805: done checking for max_fail_percentage 13131 1726867188.47806: checking to see if all hosts have failed and the running result is not ok 13131 1726867188.47807: done checking to see if all hosts have failed 13131 1726867188.47808: getting the remaining hosts for this loop 13131 1726867188.47809: done getting the 
remaining hosts for this loop 13131 1726867188.47812: getting the next task for host managed_node1 13131 1726867188.47817: done getting next task for host managed_node1 13131 1726867188.47819: ^ task is: TASK: meta (flush_handlers) 13131 1726867188.47821: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867188.47824: getting variables 13131 1726867188.47825: in VariableManager get_vars() 13131 1726867188.47864: Calling all_inventory to load vars for managed_node1 13131 1726867188.47876: Calling groups_inventory to load vars for managed_node1 13131 1726867188.47881: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867188.47886: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000218 13131 1726867188.47889: WORKER PROCESS EXITING 13131 1726867188.47899: Calling all_plugins_play to load vars for managed_node1 13131 1726867188.47902: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867188.47905: Calling groups_plugins_play to load vars for managed_node1 13131 1726867188.48085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867188.48290: done with get_vars() 13131 1726867188.48301: done getting variables 13131 1726867188.48372: in VariableManager get_vars() 13131 1726867188.48392: Calling all_inventory to load vars for managed_node1 13131 1726867188.48395: Calling groups_inventory to load vars for managed_node1 13131 1726867188.48397: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867188.48401: Calling all_plugins_play to load vars for managed_node1 13131 1726867188.48403: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867188.48406: 
Calling groups_plugins_play to load vars for managed_node1 13131 1726867188.48557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867188.48761: done with get_vars() 13131 1726867188.48774: done queuing things up, now waiting for results queue to drain 13131 1726867188.48775: results queue empty 13131 1726867188.48776: checking for any_errors_fatal 13131 1726867188.48782: done checking for any_errors_fatal 13131 1726867188.48783: checking for max_fail_percentage 13131 1726867188.48790: done checking for max_fail_percentage 13131 1726867188.48790: checking to see if all hosts have failed and the running result is not ok 13131 1726867188.48791: done checking to see if all hosts have failed 13131 1726867188.48792: getting the remaining hosts for this loop 13131 1726867188.48793: done getting the remaining hosts for this loop 13131 1726867188.48795: getting the next task for host managed_node1 13131 1726867188.48798: done getting next task for host managed_node1 13131 1726867188.48800: ^ task is: TASK: INIT Prepare setup 13131 1726867188.48802: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867188.48804: getting variables 13131 1726867188.48804: in VariableManager get_vars() 13131 1726867188.48821: Calling all_inventory to load vars for managed_node1 13131 1726867188.48824: Calling groups_inventory to load vars for managed_node1 13131 1726867188.48825: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867188.48830: Calling all_plugins_play to load vars for managed_node1 13131 1726867188.48832: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867188.48835: Calling groups_plugins_play to load vars for managed_node1 13131 1726867188.48990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867188.49215: done with get_vars() 13131 1726867188.49223: done getting variables 13131 1726867188.49305: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:15 Friday 20 September 2024 17:19:48 -0400 (0:00:01.090) 0:00:03.603 ****** 13131 1726867188.49330: entering _queue_task() for managed_node1/debug 13131 1726867188.49332: Creating lock for debug 13131 1726867188.49742: worker is 1 (out of 1 available) 13131 1726867188.49752: exiting _queue_task() for managed_node1/debug 13131 1726867188.49762: done queuing things up, now waiting for results queue to drain 13131 1726867188.49763: waiting for pending results... 
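The `TASK [INIT Prepare setup]` banner above is followed by a timing suffix, `Friday 20 September 2024 17:19:48 -0400 (0:00:01.090)  0:00:03.603`, where the parenthesized duration is the previous task's elapsed time and the second figure is the cumulative playbook runtime. A minimal log-scraping sketch for pulling those two durations out of such a line (the function and field names are my own; ansible-core does not expose this as an API):

```python
import re

# Matches the "(H:MM:SS.ffffff)  H:MM:SS.ffffff" suffix that ansible's
# timing callback appends after each TASK banner in this log.
PROFILE_RE = re.compile(
    r"\((?P<delta>\d+:\d{2}:\d{2}\.\d+)\)\s+(?P<total>\d+:\d{2}:\d{2}\.\d+)"
)

def to_seconds(hms: str) -> float:
    """Convert an H:MM:SS.fff duration string to seconds."""
    h, m, s = hms.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

def parse_profile(line: str):
    """Return (per_task_seconds, cumulative_seconds), or None if absent."""
    m = PROFILE_RE.search(line)
    if m is None:
        return None
    return to_seconds(m.group("delta")), to_seconds(m.group("total"))
```

Against the banner line above, this yields roughly `(1.09, 3.603)`: the preceding fact-gathering task took about a second of the ~3.6 s total so far.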
13131 1726867188.49922: running TaskExecutor() for managed_node1/TASK: INIT Prepare setup 13131 1726867188.50028: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000000b 13131 1726867188.50053: variable 'ansible_search_path' from source: unknown 13131 1726867188.50174: calling self._execute() 13131 1726867188.50211: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867188.50225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867188.50239: variable 'omit' from source: magic vars 13131 1726867188.50652: variable 'ansible_distribution_major_version' from source: facts 13131 1726867188.50668: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867188.50680: variable 'omit' from source: magic vars 13131 1726867188.50701: variable 'omit' from source: magic vars 13131 1726867188.50756: variable 'omit' from source: magic vars 13131 1726867188.50826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867188.50857: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867188.50882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867188.50934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867188.50937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867188.50965: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867188.50975: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867188.50987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867188.51151: Set connection var ansible_connection to ssh 13131 1726867188.51155: Set 
connection var ansible_timeout to 10 13131 1726867188.51157: Set connection var ansible_shell_type to sh 13131 1726867188.51159: Set connection var ansible_shell_executable to /bin/sh 13131 1726867188.51161: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867188.51163: Set connection var ansible_pipelining to False 13131 1726867188.51182: variable 'ansible_shell_executable' from source: unknown 13131 1726867188.51190: variable 'ansible_connection' from source: unknown 13131 1726867188.51197: variable 'ansible_module_compression' from source: unknown 13131 1726867188.51203: variable 'ansible_shell_type' from source: unknown 13131 1726867188.51208: variable 'ansible_shell_executable' from source: unknown 13131 1726867188.51213: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867188.51220: variable 'ansible_pipelining' from source: unknown 13131 1726867188.51225: variable 'ansible_timeout' from source: unknown 13131 1726867188.51232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867188.51393: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867188.51479: variable 'omit' from source: magic vars 13131 1726867188.51484: starting attempt loop 13131 1726867188.51487: running the handler 13131 1726867188.51489: handler run complete 13131 1726867188.51520: attempt loop complete, returning result 13131 1726867188.51527: _execute() done 13131 1726867188.51534: dumping result to json 13131 1726867188.51540: done dumping result, returning 13131 1726867188.51551: done running TaskExecutor() for managed_node1/TASK: INIT Prepare setup [0affcac9-a3a5-5f24-9b7a-00000000000b] 13131 1726867188.51558: sending task result for task 
0affcac9-a3a5-5f24-9b7a-00000000000b ok: [managed_node1] => {} MSG: ################################################## 13131 1726867188.51738: no more pending results, returning what we have 13131 1726867188.51741: results queue empty 13131 1726867188.51742: checking for any_errors_fatal 13131 1726867188.51743: done checking for any_errors_fatal 13131 1726867188.51744: checking for max_fail_percentage 13131 1726867188.51746: done checking for max_fail_percentage 13131 1726867188.51746: checking to see if all hosts have failed and the running result is not ok 13131 1726867188.51747: done checking to see if all hosts have failed 13131 1726867188.51748: getting the remaining hosts for this loop 13131 1726867188.51749: done getting the remaining hosts for this loop 13131 1726867188.51752: getting the next task for host managed_node1 13131 1726867188.51758: done getting next task for host managed_node1 13131 1726867188.51760: ^ task is: TASK: Install dnsmasq 13131 1726867188.51763: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867188.51767: getting variables 13131 1726867188.51769: in VariableManager get_vars() 13131 1726867188.51818: Calling all_inventory to load vars for managed_node1 13131 1726867188.51821: Calling groups_inventory to load vars for managed_node1 13131 1726867188.51823: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867188.51833: Calling all_plugins_play to load vars for managed_node1 13131 1726867188.51836: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867188.51839: Calling groups_plugins_play to load vars for managed_node1 13131 1726867188.52240: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000000b 13131 1726867188.52243: WORKER PROCESS EXITING 13131 1726867188.52263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867188.52480: done with get_vars() 13131 1726867188.52489: done getting variables 13131 1726867188.52552: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 17:19:48 -0400 (0:00:00.032) 0:00:03.636 ****** 13131 1726867188.52582: entering _queue_task() for managed_node1/package 13131 1726867188.52828: worker is 1 (out of 1 available) 13131 1726867188.52839: exiting _queue_task() for managed_node1/package 13131 1726867188.52850: done queuing things up, now waiting for results queue to drain 13131 1726867188.52851: waiting for pending results... 
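The `Install dnsmasq` task goes through the generic `package` action plugin loaded above; because the earlier fact gathering reported `"ansible_pkg_mgr": "dnf"` for this CentOS Stream 10 host, the plugin can dispatch to the dnf backend without the playbook naming it. A simplified, purely illustrative sketch of that fact-based dispatch (the mapping and function below are my own; the real resolution lives inside ansible-core's `package` action plugin):

```python
# Illustration only: pick a package-module backend from the ansible_pkg_mgr
# fact that the setup module gathered. This is NOT ansible-core's actual code.
BACKENDS = {
    "dnf": "ansible.builtin.dnf",
    "yum": "ansible.builtin.yum",
    "apt": "ansible.builtin.apt",
}

def resolve_package_module(facts: dict) -> str:
    """Map the gathered ansible_pkg_mgr fact to a backend module name."""
    pkg_mgr = facts.get("ansible_pkg_mgr", "auto")
    try:
        return BACKENDS[pkg_mgr]
    except KeyError:
        raise ValueError(f"no known backend for pkg_mgr {pkg_mgr!r}")
```

For the facts in this run (`"ansible_pkg_mgr": "dnf"`), the sketch resolves to the dnf backend, matching what the `package` plugin ends up executing on this host.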
13131 1726867188.53108: running TaskExecutor() for managed_node1/TASK: Install dnsmasq 13131 1726867188.53240: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000000f 13131 1726867188.53261: variable 'ansible_search_path' from source: unknown 13131 1726867188.53268: variable 'ansible_search_path' from source: unknown 13131 1726867188.53319: calling self._execute() 13131 1726867188.53405: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867188.53483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867188.53486: variable 'omit' from source: magic vars 13131 1726867188.53823: variable 'ansible_distribution_major_version' from source: facts 13131 1726867188.53852: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867188.53867: variable 'omit' from source: magic vars 13131 1726867188.53920: variable 'omit' from source: magic vars 13131 1726867188.54198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867188.56360: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867188.56429: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867188.56683: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867188.56686: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867188.56688: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867188.56691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867188.56693: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867188.56710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867188.56754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867188.56774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867188.56886: variable '__network_is_ostree' from source: set_fact 13131 1726867188.56897: variable 'omit' from source: magic vars 13131 1726867188.56939: variable 'omit' from source: magic vars 13131 1726867188.56969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867188.57003: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867188.57038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867188.57062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867188.57076: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867188.57114: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867188.57140: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867188.57244: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 13131 1726867188.57247: Set connection var ansible_connection to ssh 13131 1726867188.57261: Set connection var ansible_timeout to 10 13131 1726867188.57269: Set connection var ansible_shell_type to sh 13131 1726867188.57284: Set connection var ansible_shell_executable to /bin/sh 13131 1726867188.57298: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867188.57306: Set connection var ansible_pipelining to False 13131 1726867188.57331: variable 'ansible_shell_executable' from source: unknown 13131 1726867188.57337: variable 'ansible_connection' from source: unknown 13131 1726867188.57343: variable 'ansible_module_compression' from source: unknown 13131 1726867188.57360: variable 'ansible_shell_type' from source: unknown 13131 1726867188.57366: variable 'ansible_shell_executable' from source: unknown 13131 1726867188.57372: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867188.57381: variable 'ansible_pipelining' from source: unknown 13131 1726867188.57462: variable 'ansible_timeout' from source: unknown 13131 1726867188.57466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867188.57503: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867188.57571: variable 'omit' from source: magic vars 13131 1726867188.57576: starting attempt loop 13131 1726867188.57584: running the handler 13131 1726867188.57586: variable 'ansible_facts' from source: unknown 13131 1726867188.57588: variable 'ansible_facts' from source: unknown 13131 1726867188.57590: _low_level_execute_command(): starting 13131 1726867188.57599: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 
1726867188.58748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867188.58808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867188.58859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867188.58918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867188.58924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867188.58989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13131 1726867188.61496: stdout chunk (state=3): >>>/root <<< 13131 1726867188.61539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867188.61827: stderr chunk (state=3): >>><<< 13131 1726867188.61831: stdout chunk (state=3): >>><<< 13131 1726867188.61835: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13131 1726867188.61848: _low_level_execute_command(): starting 13131 1726867188.61851: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867188.6174872-13404-78506417435380 `" && echo ansible-tmp-1726867188.6174872-13404-78506417435380="` echo /root/.ansible/tmp/ansible-tmp-1726867188.6174872-13404-78506417435380 `" ) && sleep 0' 13131 1726867188.62606: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867188.62612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867188.62629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 
1726867188.62651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867188.62655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867188.62667: stderr chunk (state=3): >>>debug2: match found <<< 13131 1726867188.62672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867188.62745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867188.62763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867188.62766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867188.62843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13131 1726867188.65620: stdout chunk (state=3): >>>ansible-tmp-1726867188.6174872-13404-78506417435380=/root/.ansible/tmp/ansible-tmp-1726867188.6174872-13404-78506417435380 <<< 13131 1726867188.65841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867188.65844: stdout chunk (state=3): >>><<< 13131 1726867188.65846: stderr chunk (state=3): >>><<< 13131 1726867188.66083: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867188.6174872-13404-78506417435380=/root/.ansible/tmp/ansible-tmp-1726867188.6174872-13404-78506417435380 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13131 1726867188.66087: variable 'ansible_module_compression' from source: unknown 13131 1726867188.66090: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 13131 1726867188.66092: ANSIBALLZ: Acquiring lock 13131 1726867188.66094: ANSIBALLZ: Lock acquired: 140192901613856 13131 1726867188.66096: ANSIBALLZ: Creating module 13131 1726867188.81172: ANSIBALLZ: Writing module into payload 13131 1726867188.81384: ANSIBALLZ: Writing module 13131 1726867188.81390: ANSIBALLZ: Renaming module 13131 1726867188.81401: ANSIBALLZ: Done creating module 13131 1726867188.81423: variable 'ansible_facts' from source: unknown 13131 1726867188.81584: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867188.6174872-13404-78506417435380/AnsiballZ_dnf.py 13131 1726867188.81817: Sending initial data 13131 1726867188.81826: Sent initial data (151 bytes) 13131 1726867188.82528: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867188.82541: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867188.82602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867188.82620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867188.82639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867188.82711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13131 1726867188.85137: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867188.85141: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13131 1726867188.85448: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp9m25sr41 /root/.ansible/tmp/ansible-tmp-1726867188.6174872-13404-78506417435380/AnsiballZ_dnf.py <<< 13131 1726867188.85452: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867188.6174872-13404-78506417435380/AnsiballZ_dnf.py" <<< 13131 1726867188.85644: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp9m25sr41" to remote "/root/.ansible/tmp/ansible-tmp-1726867188.6174872-13404-78506417435380/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867188.6174872-13404-78506417435380/AnsiballZ_dnf.py" <<< 13131 1726867188.87299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867188.87303: stdout chunk (state=3): >>><<< 13131 1726867188.87309: stderr chunk (state=3): >>><<< 13131 1726867188.87428: done transferring module to remote 13131 1726867188.87431: _low_level_execute_command(): starting 13131 1726867188.87434: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867188.6174872-13404-78506417435380/ /root/.ansible/tmp/ansible-tmp-1726867188.6174872-13404-78506417435380/AnsiballZ_dnf.py && sleep 0' 13131 1726867188.88562: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867188.88699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867188.88726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867188.88802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867188.90754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867188.90810: stderr chunk (state=3): >>><<< 13131 1726867188.90822: stdout chunk (state=3): >>><<< 13131 1726867188.90927: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867188.90931: _low_level_execute_command(): starting 13131 1726867188.90934: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867188.6174872-13404-78506417435380/AnsiballZ_dnf.py && sleep 0' 13131 1726867188.92174: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867188.92233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867188.92308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867188.92325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867188.92441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
13131 1726867189.47613: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 13131 1726867189.51692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867189.51697: stdout chunk (state=3): >>><<< 13131 1726867189.51699: stderr chunk (state=3): >>><<< 13131 1726867189.51719: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
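The module run above ends with AnsiballZ_dnf.py printing a single JSON object on stdout (`"msg": "Nothing to do", "changed": false, "rc": 0`), which the controller then parses to classify the task outcome. As a minimal sketch (not Ansible's actual result-handling code, and `summarize_module_result` is a hypothetical helper name), the classification logic looks roughly like this:

```python
import json

# Abbreviated form of the stdout captured in the log above, keeping only
# the fields relevant to classifying the outcome.
raw_stdout = '''{"msg": "Nothing to do", "changed": false, "results": [],
 "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"],
 "state": "present"}}}'''

def summarize_module_result(stdout: str) -> str:
    """Classify a module's JSON result the way a caller might:
    failed > changed > ok (sketch only, not Ansible internals)."""
    result = json.loads(stdout)
    if result.get("failed"):
        return "failed"
    return "changed" if result.get("changed") else "ok"

print(summarize_module_result(raw_stdout))  # -> ok (dnsmasq already present)
```

This matches the `ok: [managed_node1]` status reported later in the log: `changed` is false and `rc` is 0, so the task is neither a change nor a failure.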
13131 1726867189.52023: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867188.6174872-13404-78506417435380/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867189.52032: _low_level_execute_command(): starting 13131 1726867189.52035: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867188.6174872-13404-78506417435380/ > /dev/null 2>&1 && sleep 0' 13131 1726867189.53304: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867189.53323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867189.53341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867189.53422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867189.55276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867189.55282: stdout chunk (state=3): >>><<< 13131 1726867189.55285: stderr chunk (state=3): >>><<< 13131 1726867189.55482: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867189.55485: handler run complete 13131 1726867189.55488: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867189.56088: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867189.56091: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867189.56093: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867189.56096: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867189.56146: variable '__install_status' from source: unknown 13131 1726867189.56166: Evaluated conditional (__install_status is success): True 13131 1726867189.56184: attempt loop complete, returning result 13131 1726867189.56187: _execute() done 13131 1726867189.56203: dumping result to json 13131 1726867189.56210: done dumping result, returning 13131 1726867189.56218: done running TaskExecutor() for managed_node1/TASK: Install dnsmasq [0affcac9-a3a5-5f24-9b7a-00000000000f] 13131 1726867189.56221: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000000f 13131 1726867189.56338: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000000f 13131 1726867189.56341: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 13131 1726867189.56444: no more pending results, returning what we have 13131 1726867189.56447: results queue empty 13131 1726867189.56449: checking for any_errors_fatal 13131 1726867189.56461: done checking for any_errors_fatal 13131 1726867189.56462: checking for max_fail_percentage 13131 1726867189.56464: done checking for max_fail_percentage 13131 1726867189.56465: checking to see if all hosts have failed and the running result is not ok 13131 1726867189.56465: done checking to see if all hosts have failed 13131 1726867189.56466: getting the remaining 
hosts for this loop 13131 1726867189.56468: done getting the remaining hosts for this loop 13131 1726867189.56471: getting the next task for host managed_node1 13131 1726867189.56480: done getting next task for host managed_node1 13131 1726867189.56483: ^ task is: TASK: Install pgrep, sysctl 13131 1726867189.56486: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867189.56489: getting variables 13131 1726867189.56493: in VariableManager get_vars() 13131 1726867189.56905: Calling all_inventory to load vars for managed_node1 13131 1726867189.56908: Calling groups_inventory to load vars for managed_node1 13131 1726867189.56911: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867189.56921: Calling all_plugins_play to load vars for managed_node1 13131 1726867189.56924: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867189.56926: Calling groups_plugins_play to load vars for managed_node1 13131 1726867189.57426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867189.57642: done with get_vars() 13131 1726867189.57652: done getting variables 13131 1726867189.57712: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 17:19:49 -0400 (0:00:01.051) 0:00:04.688 ****** 13131 1726867189.57741: entering _queue_task() for managed_node1/package 13131 1726867189.58031: worker is 1 (out of 1 available) 13131 1726867189.58043: exiting _queue_task() for managed_node1/package 13131 1726867189.58054: done queuing things up, now waiting for results queue to drain 13131 1726867189.58055: waiting for pending results... 13131 1726867189.58307: running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl 13131 1726867189.58432: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000010 13131 1726867189.58451: variable 'ansible_search_path' from source: unknown 13131 1726867189.58458: variable 'ansible_search_path' from source: unknown 13131 1726867189.58504: calling self._execute() 13131 1726867189.58600: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867189.58626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867189.58655: variable 'omit' from source: magic vars 13131 1726867189.59363: variable 'ansible_distribution_major_version' from source: facts 13131 1726867189.59400: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867189.59704: variable 'ansible_os_family' from source: facts 13131 1726867189.59714: Evaluated conditional (ansible_os_family == 'RedHat'): True 13131 1726867189.60083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867189.60525: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867189.60583: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867189.60625: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867189.60708: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867189.60797: variable 'ansible_distribution_major_version' from source: facts 13131 1726867189.60815: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 13131 1726867189.60822: when evaluation is False, skipping this task 13131 1726867189.60829: _execute() done 13131 1726867189.60837: dumping result to json 13131 1726867189.60845: done dumping result, returning 13131 1726867189.60855: done running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl [0affcac9-a3a5-5f24-9b7a-000000000010] 13131 1726867189.60863: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000010 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 13131 1726867189.61041: no more pending results, returning what we have 13131 1726867189.61045: results queue empty 13131 1726867189.61046: checking for any_errors_fatal 13131 1726867189.61052: done checking for any_errors_fatal 13131 1726867189.61053: checking for max_fail_percentage 13131 1726867189.61054: done checking for max_fail_percentage 13131 1726867189.61055: checking to see if all hosts have failed and the running result is not ok 13131 1726867189.61056: done checking to see if all hosts have failed 13131 1726867189.61056: getting the remaining hosts for this loop 13131 1726867189.61058: done getting the remaining hosts for this loop 13131 1726867189.61061: getting the next task for host managed_node1 13131 1726867189.61069: done getting next task for host managed_node1 13131 1726867189.61071: ^ task is: TASK: Install pgrep, sysctl 
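The skip above comes from the conditional `ansible_distribution_major_version is version('6', '<=')` evaluating to False: the managed node's major version is greater than 6, so the legacy-path task does not run. A rough sketch of that comparison for plain numeric dotted versions (Ansible's real `version` test handles more formats via its version-comparison utilities; `is_le` here is a hypothetical helper):

```python
def version_tuple(v: str) -> tuple:
    # Split a dotted version string into comparable integer components.
    return tuple(int(part) for part in v.split("."))

def is_le(version: str, threshold: str) -> bool:
    """Rough equivalent of Jinja's `version(threshold, '<=')` test for
    simple numeric versions (no epochs, letters, or release suffixes)."""
    return version_tuple(version) <= version_tuple(threshold)

# A modern EL major version fails the `<= 6` check, so the task is
# skipped with "Conditional result was False", as in the log.
print(is_le("9", "6"))   # False -> task skipped
print(is_le("6", "6"))   # True  -> task would run
```

Tuple comparison works element-by-element, so `"5.10"` still compares below `"6"` even though 10 > 6 as a second component.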
13131 1726867189.61074: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867189.61210: getting variables 13131 1726867189.61213: in VariableManager get_vars() 13131 1726867189.61267: Calling all_inventory to load vars for managed_node1 13131 1726867189.61270: Calling groups_inventory to load vars for managed_node1 13131 1726867189.61273: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867189.61325: Calling all_plugins_play to load vars for managed_node1 13131 1726867189.61329: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867189.61334: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000010 13131 1726867189.61337: WORKER PROCESS EXITING 13131 1726867189.61341: Calling groups_plugins_play to load vars for managed_node1 13131 1726867189.61625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867189.61841: done with get_vars() 13131 1726867189.61851: done getting variables 13131 1726867189.61918: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 17:19:49 -0400 (0:00:00.042) 0:00:04.730 ****** 13131 1726867189.61946: entering _queue_task() for managed_node1/package 13131 1726867189.62285: worker is 1 (out of 1 available) 13131 1726867189.62359: exiting _queue_task() for managed_node1/package 13131 1726867189.62375: done queuing things up, now waiting for results queue to drain 13131 1726867189.62378: waiting for pending results... 13131 1726867189.62567: running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl 13131 1726867189.62704: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000011 13131 1726867189.62723: variable 'ansible_search_path' from source: unknown 13131 1726867189.62731: variable 'ansible_search_path' from source: unknown 13131 1726867189.62781: calling self._execute() 13131 1726867189.62880: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867189.62896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867189.62911: variable 'omit' from source: magic vars 13131 1726867189.63546: variable 'ansible_distribution_major_version' from source: facts 13131 1726867189.63620: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867189.63700: variable 'ansible_os_family' from source: facts 13131 1726867189.63711: Evaluated conditional (ansible_os_family == 'RedHat'): True 13131 1726867189.63905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867189.64232: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867189.64294: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867189.64353: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867189.64426: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867189.64583: variable 'ansible_distribution_major_version' from source: facts 13131 1726867189.64586: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 13131 1726867189.64589: variable 'omit' from source: magic vars 13131 1726867189.64612: variable 'omit' from source: magic vars 13131 1726867189.64773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867189.69250: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867189.69412: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867189.69485: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867189.69583: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867189.69614: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867189.69827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867189.69904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867189.69930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 
13131 1726867189.70118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867189.70121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867189.70260: variable '__network_is_ostree' from source: set_fact 13131 1726867189.70344: variable 'omit' from source: magic vars 13131 1726867189.70376: variable 'omit' from source: magic vars 13131 1726867189.70419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867189.70519: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867189.70584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867189.70615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867189.70721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867189.70725: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867189.70727: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867189.70729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867189.71003: Set connection var ansible_connection to ssh 13131 1726867189.71032: Set connection var ansible_timeout to 10 13131 1726867189.71055: Set connection var ansible_shell_type to sh 13131 1726867189.71098: Set connection var ansible_shell_executable to /bin/sh 13131 1726867189.71182: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 
1726867189.71186: Set connection var ansible_pipelining to False 13131 1726867189.71189: variable 'ansible_shell_executable' from source: unknown 13131 1726867189.71191: variable 'ansible_connection' from source: unknown 13131 1726867189.71193: variable 'ansible_module_compression' from source: unknown 13131 1726867189.71200: variable 'ansible_shell_type' from source: unknown 13131 1726867189.71223: variable 'ansible_shell_executable' from source: unknown 13131 1726867189.71232: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867189.71242: variable 'ansible_pipelining' from source: unknown 13131 1726867189.71249: variable 'ansible_timeout' from source: unknown 13131 1726867189.71258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867189.71421: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867189.71425: variable 'omit' from source: magic vars 13131 1726867189.71427: starting attempt loop 13131 1726867189.71431: running the handler 13131 1726867189.71433: variable 'ansible_facts' from source: unknown 13131 1726867189.71435: variable 'ansible_facts' from source: unknown 13131 1726867189.71487: _low_level_execute_command(): starting 13131 1726867189.71531: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867189.72265: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867189.72361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867189.72402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867189.72433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867189.72552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867189.74215: stdout chunk (state=3): >>>/root <<< 13131 1726867189.74408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867189.74412: stdout chunk (state=3): >>><<< 13131 1726867189.74414: stderr chunk (state=3): >>><<< 13131 1726867189.74416: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867189.74419: _low_level_execute_command(): starting 13131 1726867189.74421: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867189.7438211-13469-87349493772972 `" && echo ansible-tmp-1726867189.7438211-13469-87349493772972="` echo /root/.ansible/tmp/ansible-tmp-1726867189.7438211-13469-87349493772972 `" ) && sleep 0' 13131 1726867189.74983: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867189.74987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867189.74989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867189.74995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867189.74998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867189.75000: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867189.75002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867189.75005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867189.75007: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867189.75009: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 13131 1726867189.75055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867189.75058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867189.75060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867189.75062: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867189.75064: stderr chunk (state=3): >>>debug2: match found <<< 13131 1726867189.75066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867189.75189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867189.75194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867189.75197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867189.75406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867189.77416: stdout chunk (state=3): >>>ansible-tmp-1726867189.7438211-13469-87349493772972=/root/.ansible/tmp/ansible-tmp-1726867189.7438211-13469-87349493772972 <<< 13131 1726867189.77459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867189.77462: stdout chunk (state=3): >>><<< 13131 1726867189.77586: stderr chunk (state=3): >>><<< 13131 1726867189.77590: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867189.7438211-13469-87349493772972=/root/.ansible/tmp/ansible-tmp-1726867189.7438211-13469-87349493772972 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867189.77595: variable 'ansible_module_compression' from source: unknown 13131 1726867189.77598: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 13131 1726867189.77603: variable 'ansible_facts' from source: unknown 13131 1726867189.77932: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867189.7438211-13469-87349493772972/AnsiballZ_dnf.py 13131 1726867189.78208: Sending initial data 13131 1726867189.78211: Sent initial data (151 bytes) 13131 1726867189.79365: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867189.79385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867189.79416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867189.79457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867189.80981: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13131 1726867189.81023: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867189.81070: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpcr6tgwsk /root/.ansible/tmp/ansible-tmp-1726867189.7438211-13469-87349493772972/AnsiballZ_dnf.py <<< 13131 1726867189.81116: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867189.7438211-13469-87349493772972/AnsiballZ_dnf.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpcr6tgwsk" to remote "/root/.ansible/tmp/ansible-tmp-1726867189.7438211-13469-87349493772972/AnsiballZ_dnf.py" <<< 13131 1726867189.81129: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867189.7438211-13469-87349493772972/AnsiballZ_dnf.py" <<< 13131 1726867189.83023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867189.83026: stderr chunk (state=3): >>><<< 13131 1726867189.83029: stdout chunk (state=3): >>><<< 13131 1726867189.83031: done transferring module to remote 13131 1726867189.83034: _low_level_execute_command(): starting 13131 1726867189.83036: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867189.7438211-13469-87349493772972/ /root/.ansible/tmp/ansible-tmp-1726867189.7438211-13469-87349493772972/AnsiballZ_dnf.py && sleep 0' 13131 1726867189.84361: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867189.84489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867189.84511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867189.84523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867189.84605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867189.86354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867189.86415: stderr chunk (state=3): >>><<< 13131 1726867189.86423: stdout chunk (state=3): >>><<< 13131 1726867189.86520: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867189.86523: _low_level_execute_command(): starting 13131 1726867189.86525: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867189.7438211-13469-87349493772972/AnsiballZ_dnf.py && sleep 0' 13131 1726867189.87843: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867189.88003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867189.88030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867189.88052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867189.88381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867190.28705: stdout chunk (state=3): >>> {"msg": "Nothing to do", 
"changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 13131 1726867190.32880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867190.32884: stdout chunk (state=3): >>><<< 13131 1726867190.32887: stderr chunk (state=3): >>><<< 13131 1726867190.33107: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
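The invocation above shows the generic `package` action resolving to the `dnf` backend on this RedHat-family host (the log loads `ActionModule 'package'` but ships a cached `ansible.modules.dnf` AnsiballZ payload). Reconstructed from the logged `module_args`, the task was approximately (the task keywords around the module call are assumptions; the name and state are confirmed by the log):

```yaml
# Reconstructed from module_args in the log output.
- name: Install pgrep, sysctl
  package:
    name: procps-ng   # confirmed: "name": ["procps-ng"] in module_args
    state: present    # confirmed: "state": "present"
```

Because `procps-ng` was already installed, the module returned `"msg": "Nothing to do", "changed": false, "rc": 0`.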
13131 1726867190.33115: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867189.7438211-13469-87349493772972/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867190.33118: _low_level_execute_command(): starting 13131 1726867190.33121: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867189.7438211-13469-87349493772972/ > /dev/null 2>&1 && sleep 0' 13131 1726867190.33681: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867190.33760: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867190.33809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867190.33833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867190.33857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867190.33936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867190.35867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867190.35876: stdout chunk (state=3): >>><<< 13131 1726867190.35889: stderr chunk (state=3): >>><<< 13131 1726867190.35907: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867190.35951: handler run 
complete 13131 1726867190.35973: attempt loop complete, returning result 13131 1726867190.35983: _execute() done 13131 1726867190.35990: dumping result to json 13131 1726867190.35999: done dumping result, returning 13131 1726867190.36010: done running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl [0affcac9-a3a5-5f24-9b7a-000000000011] 13131 1726867190.36057: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000011 13131 1726867190.36225: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000011 13131 1726867190.36228: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 13131 1726867190.36312: no more pending results, returning what we have 13131 1726867190.36315: results queue empty 13131 1726867190.36316: checking for any_errors_fatal 13131 1726867190.36323: done checking for any_errors_fatal 13131 1726867190.36324: checking for max_fail_percentage 13131 1726867190.36326: done checking for max_fail_percentage 13131 1726867190.36327: checking to see if all hosts have failed and the running result is not ok 13131 1726867190.36328: done checking to see if all hosts have failed 13131 1726867190.36328: getting the remaining hosts for this loop 13131 1726867190.36330: done getting the remaining hosts for this loop 13131 1726867190.36333: getting the next task for host managed_node1 13131 1726867190.36470: done getting next task for host managed_node1 13131 1726867190.36473: ^ task is: TASK: Create test interfaces 13131 1726867190.36476: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867190.36482: getting variables 13131 1726867190.36483: in VariableManager get_vars() 13131 1726867190.36714: Calling all_inventory to load vars for managed_node1 13131 1726867190.36717: Calling groups_inventory to load vars for managed_node1 13131 1726867190.36720: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867190.36730: Calling all_plugins_play to load vars for managed_node1 13131 1726867190.36732: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867190.36735: Calling groups_plugins_play to load vars for managed_node1 13131 1726867190.37063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867190.37411: done with get_vars() 13131 1726867190.37421: done getting variables 13131 1726867190.37552: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 17:19:50 -0400 (0:00:00.756) 0:00:05.486 ****** 13131 1726867190.37585: entering _queue_task() for managed_node1/shell 13131 1726867190.37587: Creating lock for shell 13131 1726867190.38037: worker is 1 (out of 1 available) 13131 1726867190.38049: exiting _queue_task() for managed_node1/shell 13131 1726867190.38061: done queuing things up, now waiting for results queue to drain 13131 1726867190.38062: waiting for pending results... 
13131 1726867190.38597: running TaskExecutor() for managed_node1/TASK: Create test interfaces 13131 1726867190.38606: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000012 13131 1726867190.38743: variable 'ansible_search_path' from source: unknown 13131 1726867190.38747: variable 'ansible_search_path' from source: unknown 13131 1726867190.38812: calling self._execute() 13131 1726867190.39040: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867190.39043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867190.39049: variable 'omit' from source: magic vars 13131 1726867190.39582: variable 'ansible_distribution_major_version' from source: facts 13131 1726867190.39585: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867190.39588: variable 'omit' from source: magic vars 13131 1726867190.39724: variable 'omit' from source: magic vars 13131 1726867190.40571: variable 'dhcp_interface1' from source: play vars 13131 1726867190.40585: variable 'dhcp_interface2' from source: play vars 13131 1726867190.40682: variable 'omit' from source: magic vars 13131 1726867190.40782: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867190.40785: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867190.40800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867190.40855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867190.40870: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867190.40906: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867190.40914: variable 'ansible_host' from source: host 
vars for 'managed_node1' 13131 1726867190.40921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867190.41194: Set connection var ansible_connection to ssh 13131 1726867190.41272: Set connection var ansible_timeout to 10 13131 1726867190.41281: Set connection var ansible_shell_type to sh 13131 1726867190.41283: Set connection var ansible_shell_executable to /bin/sh 13131 1726867190.41320: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867190.41415: Set connection var ansible_pipelining to False 13131 1726867190.41419: variable 'ansible_shell_executable' from source: unknown 13131 1726867190.41421: variable 'ansible_connection' from source: unknown 13131 1726867190.41423: variable 'ansible_module_compression' from source: unknown 13131 1726867190.41425: variable 'ansible_shell_type' from source: unknown 13131 1726867190.41427: variable 'ansible_shell_executable' from source: unknown 13131 1726867190.41429: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867190.41431: variable 'ansible_pipelining' from source: unknown 13131 1726867190.41432: variable 'ansible_timeout' from source: unknown 13131 1726867190.41435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867190.41654: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867190.41697: variable 'omit' from source: magic vars 13131 1726867190.41715: starting attempt loop 13131 1726867190.41722: running the handler 13131 1726867190.41739: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867190.41822: _low_level_execute_command(): starting 13131 1726867190.41825: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867190.42728: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867190.42815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867190.42844: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867190.42934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867190.44547: stdout chunk (state=3): >>>/root <<< 13131 1726867190.44748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867190.44754: stdout chunk (state=3): >>><<< 13131 1726867190.44757: stderr chunk (state=3): >>><<< 13131 1726867190.44984: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867190.44990: _low_level_execute_command(): starting 13131 1726867190.44994: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867190.4486322-13518-192087061734875 `" && echo ansible-tmp-1726867190.4486322-13518-192087061734875="` echo /root/.ansible/tmp/ansible-tmp-1726867190.4486322-13518-192087061734875 `" ) && sleep 0' 13131 1726867190.46129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 
originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867190.46195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867190.46213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867190.46242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867190.46473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867190.48223: stdout chunk (state=3): >>>ansible-tmp-1726867190.4486322-13518-192087061734875=/root/.ansible/tmp/ansible-tmp-1726867190.4486322-13518-192087061734875 <<< 13131 1726867190.48349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867190.48388: stderr chunk (state=3): >>><<< 13131 1726867190.48407: stdout chunk (state=3): >>><<< 13131 1726867190.48445: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867190.4486322-13518-192087061734875=/root/.ansible/tmp/ansible-tmp-1726867190.4486322-13518-192087061734875 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867190.48468: variable 'ansible_module_compression' from source: unknown 13131 1726867190.48532: ANSIBALLZ: Using generic lock for ansible.legacy.command 13131 1726867190.48553: ANSIBALLZ: Acquiring lock 13131 1726867190.48556: ANSIBALLZ: Lock acquired: 140192901613856 13131 1726867190.48659: ANSIBALLZ: Creating module 13131 1726867190.68712: ANSIBALLZ: Writing module into payload 13131 1726867190.68916: ANSIBALLZ: Writing module 13131 1726867190.68935: ANSIBALLZ: Renaming module 13131 1726867190.68948: ANSIBALLZ: Done creating module 13131 1726867190.69082: variable 'ansible_facts' from source: unknown 13131 1726867190.69158: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867190.4486322-13518-192087061734875/AnsiballZ_command.py 13131 1726867190.69420: Sending initial data 13131 1726867190.69423: Sent initial data (156 bytes) 13131 1726867190.70853: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867190.71122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867190.71362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867190.71556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867190.73089: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867190.73194: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867190.73311: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp5e5mv6xn /root/.ansible/tmp/ansible-tmp-1726867190.4486322-13518-192087061734875/AnsiballZ_command.py <<< 13131 1726867190.73320: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867190.4486322-13518-192087061734875/AnsiballZ_command.py" <<< 13131 1726867190.73412: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp5e5mv6xn" to remote "/root/.ansible/tmp/ansible-tmp-1726867190.4486322-13518-192087061734875/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867190.4486322-13518-192087061734875/AnsiballZ_command.py" <<< 13131 1726867190.75321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867190.75324: stderr chunk (state=3): >>><<< 13131 1726867190.75327: stdout chunk (state=3): >>><<< 13131 1726867190.75495: done transferring module to remote 13131 1726867190.75505: _low_level_execute_command(): starting 13131 1726867190.75508: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867190.4486322-13518-192087061734875/ /root/.ansible/tmp/ansible-tmp-1726867190.4486322-13518-192087061734875/AnsiballZ_command.py && sleep 0' 13131 1726867190.76735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867190.76738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867190.76741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867190.76744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867190.76746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 
13131 1726867190.76749: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867190.76751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867190.76753: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867190.76755: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867190.76760: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867190.76768: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867190.76955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867190.76996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867190.77115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867190.78933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867190.78937: stderr chunk (state=3): >>><<< 13131 1726867190.78939: stdout chunk (state=3): >>><<< 13131 1726867190.78942: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867190.78944: _low_level_execute_command(): starting 13131 1726867190.78947: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867190.4486322-13518-192087061734875/AnsiballZ_command.py && sleep 0' 13131 1726867190.80516: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867190.80753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867190.80756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867190.80759: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867190.80762: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867190.80764: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867190.81081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867190.81085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867190.81111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867190.81181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867192.18320: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 700 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 700 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr 
--bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 17:19:50.961882", "end": "2024-09-20 17:19:52.180351", "delta": "0:00:01.218469", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13131 1726867192.19885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867192.19914: stdout chunk (state=3): >>><<< 13131 1726867192.19918: stderr chunk (state=3): >>><<< 13131 1726867192.20217: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 700 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 700 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 17:19:50.961882", "end": "2024-09-20 17:19:52.180351", "delta": "0:00:01.218469", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867192.20225: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867190.4486322-13518-192087061734875/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867192.20228: _low_level_execute_command(): starting 13131 1726867192.20230: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867190.4486322-13518-192087061734875/ > /dev/null 2>&1 && sleep 0' 13131 1726867192.21416: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867192.21430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867192.21527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867192.21687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867192.21756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867192.21868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867192.23745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867192.23755: stdout chunk (state=3): >>><<< 13131 1726867192.23767: stderr chunk (state=3): >>><<< 13131 1726867192.23802: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867192.23995: handler run complete 13131 1726867192.23998: Evaluated conditional (False): False 13131 1726867192.24000: attempt loop complete, returning result 13131 1726867192.24004: _execute() done 13131 1726867192.24006: dumping result to json 13131 1726867192.24008: done dumping result, returning 13131 1726867192.24010: done running TaskExecutor() for managed_node1/TASK: Create test interfaces [0affcac9-a3a5-5f24-9b7a-000000000012] 13131 1726867192.24012: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000012 13131 1726867192.24526: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000012 13131 1726867192.24529: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.218469", "end": "2024-09-20 17:19:52.180351", "rc": 0, "start": "2024-09-20 17:19:50.961882" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 700 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 700 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 13131 1726867192.24612: no more pending results, returning what we have 13131 1726867192.24615: results queue empty 13131 1726867192.24616: checking for any_errors_fatal 13131 1726867192.24623: done checking for any_errors_fatal 13131 1726867192.24624: checking for max_fail_percentage 13131 1726867192.24625: done checking for max_fail_percentage 13131 1726867192.24626: checking to see if all hosts have failed and 
the running result is not ok 13131 1726867192.24627: done checking to see if all hosts have failed 13131 1726867192.24627: getting the remaining hosts for this loop 13131 1726867192.24629: done getting the remaining hosts for this loop 13131 1726867192.24632: getting the next task for host managed_node1 13131 1726867192.24642: done getting next task for host managed_node1 13131 1726867192.24645: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13131 1726867192.24648: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867192.24651: getting variables 13131 1726867192.24652: in VariableManager get_vars() 13131 1726867192.24706: Calling all_inventory to load vars for managed_node1 13131 1726867192.24709: Calling groups_inventory to load vars for managed_node1 13131 1726867192.24712: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867192.24723: Calling all_plugins_play to load vars for managed_node1 13131 1726867192.24726: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867192.24729: Calling groups_plugins_play to load vars for managed_node1 13131 1726867192.25313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867192.25717: done with get_vars() 13131 1726867192.25728: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 17:19:52 -0400 (0:00:01.883) 0:00:07.369 ****** 13131 1726867192.25936: entering _queue_task() for managed_node1/include_tasks 13131 1726867192.26650: worker is 1 (out of 1 available) 13131 1726867192.26663: exiting _queue_task() for managed_node1/include_tasks 13131 1726867192.26674: done queuing things up, now waiting for results queue to drain 13131 1726867192.26676: waiting for pending results... 
13131 1726867192.26834: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 13131 1726867192.26983: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000016 13131 1726867192.26988: variable 'ansible_search_path' from source: unknown 13131 1726867192.26993: variable 'ansible_search_path' from source: unknown 13131 1726867192.27038: calling self._execute() 13131 1726867192.27182: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867192.27186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867192.27188: variable 'omit' from source: magic vars 13131 1726867192.27553: variable 'ansible_distribution_major_version' from source: facts 13131 1726867192.27572: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867192.27585: _execute() done 13131 1726867192.27596: dumping result to json 13131 1726867192.27603: done dumping result, returning 13131 1726867192.27613: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-5f24-9b7a-000000000016] 13131 1726867192.27623: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000016 13131 1726867192.27902: no more pending results, returning what we have 13131 1726867192.27906: in VariableManager get_vars() 13131 1726867192.27961: Calling all_inventory to load vars for managed_node1 13131 1726867192.27964: Calling groups_inventory to load vars for managed_node1 13131 1726867192.27967: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867192.27981: Calling all_plugins_play to load vars for managed_node1 13131 1726867192.27985: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867192.27989: Calling groups_plugins_play to load vars for managed_node1 13131 1726867192.28262: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000016 13131 1726867192.28265: WORKER PROCESS EXITING 13131 
1726867192.28289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867192.28501: done with get_vars() 13131 1726867192.28508: variable 'ansible_search_path' from source: unknown 13131 1726867192.28509: variable 'ansible_search_path' from source: unknown 13131 1726867192.28549: we have included files to process 13131 1726867192.28550: generating all_blocks data 13131 1726867192.28551: done generating all_blocks data 13131 1726867192.28552: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13131 1726867192.28553: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13131 1726867192.28555: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13131 1726867192.28775: done processing included file 13131 1726867192.28779: iterating over new_blocks loaded from include file 13131 1726867192.28780: in VariableManager get_vars() 13131 1726867192.28806: done with get_vars() 13131 1726867192.28808: filtering new block on tags 13131 1726867192.28822: done filtering new block on tags 13131 1726867192.28824: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 13131 1726867192.28828: extending task lists for all hosts with included blocks 13131 1726867192.28925: done extending task lists 13131 1726867192.28927: done processing included files 13131 1726867192.28927: results queue empty 13131 1726867192.28928: checking for any_errors_fatal 13131 1726867192.28934: done checking for any_errors_fatal 13131 1726867192.28934: checking for max_fail_percentage 13131 1726867192.28935: done checking for 
max_fail_percentage 13131 1726867192.28936: checking to see if all hosts have failed and the running result is not ok 13131 1726867192.28937: done checking to see if all hosts have failed 13131 1726867192.28938: getting the remaining hosts for this loop 13131 1726867192.28938: done getting the remaining hosts for this loop 13131 1726867192.28941: getting the next task for host managed_node1 13131 1726867192.28944: done getting next task for host managed_node1 13131 1726867192.28952: ^ task is: TASK: Get stat for interface {{ interface }} 13131 1726867192.28955: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867192.28957: getting variables 13131 1726867192.28958: in VariableManager get_vars() 13131 1726867192.28975: Calling all_inventory to load vars for managed_node1 13131 1726867192.28979: Calling groups_inventory to load vars for managed_node1 13131 1726867192.28982: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867192.28986: Calling all_plugins_play to load vars for managed_node1 13131 1726867192.28988: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867192.28994: Calling groups_plugins_play to load vars for managed_node1 13131 1726867192.29173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867192.29366: done with get_vars() 13131 1726867192.29375: done getting variables 13131 1726867192.29539: variable 'interface' from source: task vars 13131 1726867192.29543: variable 'dhcp_interface1' from source: play vars 13131 1726867192.29620: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:19:52 -0400 (0:00:00.037) 0:00:07.407 ****** 13131 1726867192.29662: entering _queue_task() for managed_node1/stat 13131 1726867192.29919: worker is 1 (out of 1 available) 13131 1726867192.30045: exiting _queue_task() for managed_node1/stat 13131 1726867192.30054: done queuing things up, now waiting for results queue to drain 13131 1726867192.30055: waiting for pending results... 
13131 1726867192.30204: running TaskExecutor() for managed_node1/TASK: Get stat for interface test1 13131 1726867192.30338: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000248 13131 1726867192.30363: variable 'ansible_search_path' from source: unknown 13131 1726867192.30373: variable 'ansible_search_path' from source: unknown 13131 1726867192.30419: calling self._execute() 13131 1726867192.30516: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867192.30526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867192.30537: variable 'omit' from source: magic vars 13131 1726867192.31035: variable 'ansible_distribution_major_version' from source: facts 13131 1726867192.31234: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867192.31239: variable 'omit' from source: magic vars 13131 1726867192.31587: variable 'omit' from source: magic vars 13131 1726867192.31593: variable 'interface' from source: task vars 13131 1726867192.31596: variable 'dhcp_interface1' from source: play vars 13131 1726867192.31752: variable 'dhcp_interface1' from source: play vars 13131 1726867192.31822: variable 'omit' from source: magic vars 13131 1726867192.31862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867192.32009: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867192.32130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867192.32136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867192.32139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867192.32141: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 13131 1726867192.32143: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867192.32145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867192.32240: Set connection var ansible_connection to ssh 13131 1726867192.32253: Set connection var ansible_timeout to 10 13131 1726867192.32259: Set connection var ansible_shell_type to sh 13131 1726867192.32271: Set connection var ansible_shell_executable to /bin/sh 13131 1726867192.32287: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867192.32300: Set connection var ansible_pipelining to False 13131 1726867192.32325: variable 'ansible_shell_executable' from source: unknown 13131 1726867192.32334: variable 'ansible_connection' from source: unknown 13131 1726867192.32348: variable 'ansible_module_compression' from source: unknown 13131 1726867192.32355: variable 'ansible_shell_type' from source: unknown 13131 1726867192.32363: variable 'ansible_shell_executable' from source: unknown 13131 1726867192.32375: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867192.32388: variable 'ansible_pipelining' from source: unknown 13131 1726867192.32399: variable 'ansible_timeout' from source: unknown 13131 1726867192.32407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867192.32618: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867192.32632: variable 'omit' from source: magic vars 13131 1726867192.32641: starting attempt loop 13131 1726867192.32647: running the handler 13131 1726867192.32672: _low_level_execute_command(): starting 13131 1726867192.32785: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 
1726867192.33415: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867192.33496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867192.33553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867192.33571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867192.33601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867192.33686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867192.35523: stdout chunk (state=3): >>>/root <<< 13131 1726867192.35730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867192.35739: stdout chunk (state=3): >>><<< 13131 1726867192.35744: stderr chunk (state=3): >>><<< 13131 1726867192.35766: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867192.35785: _low_level_execute_command(): starting 13131 1726867192.35794: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867192.3576565-13614-260611087409104 `" && echo ansible-tmp-1726867192.3576565-13614-260611087409104="` echo /root/.ansible/tmp/ansible-tmp-1726867192.3576565-13614-260611087409104 `" ) && sleep 0' 13131 1726867192.36337: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867192.36433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867192.36445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867192.36524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867192.38446: stdout chunk (state=3): >>>ansible-tmp-1726867192.3576565-13614-260611087409104=/root/.ansible/tmp/ansible-tmp-1726867192.3576565-13614-260611087409104 <<< 13131 1726867192.38514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867192.38550: stderr chunk (state=3): >>><<< 13131 1726867192.38553: stdout chunk (state=3): >>><<< 13131 1726867192.38556: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867192.3576565-13614-260611087409104=/root/.ansible/tmp/ansible-tmp-1726867192.3576565-13614-260611087409104 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867192.38597: variable 'ansible_module_compression' from source: unknown 13131 1726867192.38671: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13131 1726867192.38707: variable 'ansible_facts' from source: unknown 13131 1726867192.38875: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867192.3576565-13614-260611087409104/AnsiballZ_stat.py 13131 1726867192.39001: Sending initial data 13131 1726867192.39005: Sent initial data (153 bytes) 13131 1726867192.39568: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867192.39582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867192.39589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867192.39604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867192.39616: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867192.39642: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867192.39727: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867192.39740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867192.39781: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867192.39825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867192.41338: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13131 1726867192.41349: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 13131 1726867192.41379: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867192.41426: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867192.41484: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpkajgju2m /root/.ansible/tmp/ansible-tmp-1726867192.3576565-13614-260611087409104/AnsiballZ_stat.py <<< 13131 1726867192.41491: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867192.3576565-13614-260611087409104/AnsiballZ_stat.py" <<< 13131 1726867192.41525: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpkajgju2m" to remote "/root/.ansible/tmp/ansible-tmp-1726867192.3576565-13614-260611087409104/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867192.3576565-13614-260611087409104/AnsiballZ_stat.py" <<< 13131 1726867192.42298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867192.42329: stderr chunk (state=3): >>><<< 13131 1726867192.42390: stdout chunk (state=3): >>><<< 13131 1726867192.42403: done transferring module to remote 13131 1726867192.42418: _low_level_execute_command(): starting 13131 1726867192.42426: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867192.3576565-13614-260611087409104/ /root/.ansible/tmp/ansible-tmp-1726867192.3576565-13614-260611087409104/AnsiballZ_stat.py && sleep 0' 13131 1726867192.43061: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867192.43075: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867192.43108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867192.43170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867192.43224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867192.43243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867192.43269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867192.43353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867192.45317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867192.45321: stdout chunk (state=3): >>><<< 13131 1726867192.45323: stderr chunk (state=3): >>><<< 13131 1726867192.45337: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867192.45365: _low_level_execute_command(): starting 13131 1726867192.45368: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867192.3576565-13614-260611087409104/AnsiballZ_stat.py && sleep 0' 13131 1726867192.45957: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867192.46021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867192.46034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867192.46085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867192.46103: stderr chunk (state=3): >>>debug2: 
fd 3 setting O_NONBLOCK <<< 13131 1726867192.46153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867192.46242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867192.61288: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27758, "dev": 23, "nlink": 1, "atime": 1726867190.9684014, "mtime": 1726867190.9684014, "ctime": 1726867190.9684014, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13131 1726867192.62784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867192.62788: stderr chunk (state=3): >>><<< 13131 1726867192.62791: stdout chunk (state=3): >>><<< 13131 1726867192.62793: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27758, "dev": 23, "nlink": 1, "atime": 1726867190.9684014, "mtime": 1726867190.9684014, "ctime": 1726867190.9684014, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867192.62796: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867192.3576565-13614-260611087409104/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867192.62798: _low_level_execute_command(): starting 13131 1726867192.62800: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867192.3576565-13614-260611087409104/ > /dev/null 2>&1 && sleep 0' 13131 1726867192.63417: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867192.63425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867192.63438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867192.63492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867192.63573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867192.63620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867192.63668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867192.65503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867192.65506: stdout chunk (state=3): >>><<< 13131 1726867192.65508: stderr chunk (state=3): >>><<< 13131 1726867192.65524: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867192.65683: handler run complete 13131 1726867192.65686: attempt loop complete, returning result 13131 1726867192.65688: _execute() done 13131 1726867192.65691: dumping result to json 13131 1726867192.65692: done dumping result, returning 13131 1726867192.65694: done running TaskExecutor() for managed_node1/TASK: Get stat for interface test1 [0affcac9-a3a5-5f24-9b7a-000000000248] 13131 1726867192.65697: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000248 13131 1726867192.65776: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000248 ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726867190.9684014, "block_size": 4096, "blocks": 0, "ctime": 1726867190.9684014, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27758, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726867190.9684014, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13131 1726867192.65889: no more pending results, returning what we have 13131 1726867192.65893: results queue empty 13131 1726867192.65894: checking for any_errors_fatal 13131 1726867192.65895: done checking for any_errors_fatal 
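The module_args echoed in the stat result above imply a task of roughly this shape in get_interface_stat.yml. This is a hedged reconstruction from the logged invocation, not the file's verbatim contents; in particular, the log later attributes the interface_stat variable to set_fact, so the real file may register the result under another name and copy it via set_fact.

```yaml
# Reconstructed from module_args in the log; assumes a plain register, see note above.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"   # resolves to /sys/class/net/test1 here
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```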
13131 1726867192.65896: checking for max_fail_percentage 13131 1726867192.65898: done checking for max_fail_percentage 13131 1726867192.65899: checking to see if all hosts have failed and the running result is not ok 13131 1726867192.65899: done checking to see if all hosts have failed 13131 1726867192.65900: getting the remaining hosts for this loop 13131 1726867192.65901: done getting the remaining hosts for this loop 13131 1726867192.65905: getting the next task for host managed_node1 13131 1726867192.65913: done getting next task for host managed_node1 13131 1726867192.65916: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13131 1726867192.65919: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867192.65925: getting variables 13131 1726867192.65926: in VariableManager get_vars() 13131 1726867192.66100: Calling all_inventory to load vars for managed_node1 13131 1726867192.66104: Calling groups_inventory to load vars for managed_node1 13131 1726867192.66107: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867192.66192: Calling all_plugins_play to load vars for managed_node1 13131 1726867192.66203: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867192.66208: Calling groups_plugins_play to load vars for managed_node1 13131 1726867192.66490: WORKER PROCESS EXITING 13131 1726867192.66511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867192.66727: done with get_vars() 13131 1726867192.66739: done getting variables 13131 1726867192.66845: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 13131 1726867192.66981: variable 'interface' from source: task vars 13131 1726867192.66986: variable 'dhcp_interface1' from source: play vars 13131 1726867192.67043: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 17:19:52 -0400 (0:00:00.374) 0:00:07.781 ****** 13131 1726867192.67085: entering _queue_task() for managed_node1/assert 13131 1726867192.67088: Creating lock for assert 13131 1726867192.67353: worker is 1 (out of 1 available) 13131 1726867192.67365: exiting _queue_task() for managed_node1/assert 13131 1726867192.67376: done queuing things up, now 
waiting for results queue to drain 13131 1726867192.67380: waiting for pending results... 13131 1726867192.67641: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test1' 13131 1726867192.67749: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000017 13131 1726867192.67765: variable 'ansible_search_path' from source: unknown 13131 1726867192.67772: variable 'ansible_search_path' from source: unknown 13131 1726867192.67816: calling self._execute() 13131 1726867192.67900: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867192.67920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867192.67932: variable 'omit' from source: magic vars 13131 1726867192.68354: variable 'ansible_distribution_major_version' from source: facts 13131 1726867192.68373: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867192.68390: variable 'omit' from source: magic vars 13131 1726867192.68440: variable 'omit' from source: magic vars 13131 1726867192.68543: variable 'interface' from source: task vars 13131 1726867192.68560: variable 'dhcp_interface1' from source: play vars 13131 1726867192.68669: variable 'dhcp_interface1' from source: play vars 13131 1726867192.68673: variable 'omit' from source: magic vars 13131 1726867192.68707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867192.68745: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867192.68768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867192.68800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867192.68823: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 13131 1726867192.68886: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867192.68889: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867192.68891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867192.68981: Set connection var ansible_connection to ssh 13131 1726867192.69002: Set connection var ansible_timeout to 10 13131 1726867192.69027: Set connection var ansible_shell_type to sh 13131 1726867192.69030: Set connection var ansible_shell_executable to /bin/sh 13131 1726867192.69039: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867192.69050: Set connection var ansible_pipelining to False 13131 1726867192.69102: variable 'ansible_shell_executable' from source: unknown 13131 1726867192.69105: variable 'ansible_connection' from source: unknown 13131 1726867192.69107: variable 'ansible_module_compression' from source: unknown 13131 1726867192.69109: variable 'ansible_shell_type' from source: unknown 13131 1726867192.69112: variable 'ansible_shell_executable' from source: unknown 13131 1726867192.69114: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867192.69116: variable 'ansible_pipelining' from source: unknown 13131 1726867192.69118: variable 'ansible_timeout' from source: unknown 13131 1726867192.69137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867192.69282: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867192.69320: variable 'omit' from source: magic vars 13131 1726867192.69323: starting attempt loop 13131 1726867192.69326: running the handler 13131 1726867192.69463: variable 
'interface_stat' from source: set_fact 13131 1726867192.69538: Evaluated conditional (interface_stat.stat.exists): True 13131 1726867192.69542: handler run complete 13131 1726867192.69544: attempt loop complete, returning result 13131 1726867192.69546: _execute() done 13131 1726867192.69549: dumping result to json 13131 1726867192.69551: done dumping result, returning 13131 1726867192.69553: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test1' [0affcac9-a3a5-5f24-9b7a-000000000017] 13131 1726867192.69555: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000017 13131 1726867192.69757: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000017 13131 1726867192.69760: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 13131 1726867192.69808: no more pending results, returning what we have 13131 1726867192.69811: results queue empty 13131 1726867192.69812: checking for any_errors_fatal 13131 1726867192.69821: done checking for any_errors_fatal 13131 1726867192.69821: checking for max_fail_percentage 13131 1726867192.69823: done checking for max_fail_percentage 13131 1726867192.69824: checking to see if all hosts have failed and the running result is not ok 13131 1726867192.69824: done checking to see if all hosts have failed 13131 1726867192.69825: getting the remaining hosts for this loop 13131 1726867192.69826: done getting the remaining hosts for this loop 13131 1726867192.69829: getting the next task for host managed_node1 13131 1726867192.69837: done getting next task for host managed_node1 13131 1726867192.69839: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13131 1726867192.69842: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867192.69845: getting variables 13131 1726867192.69846: in VariableManager get_vars() 13131 1726867192.69956: Calling all_inventory to load vars for managed_node1 13131 1726867192.69959: Calling groups_inventory to load vars for managed_node1 13131 1726867192.69962: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867192.70099: Calling all_plugins_play to load vars for managed_node1 13131 1726867192.70103: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867192.70106: Calling groups_plugins_play to load vars for managed_node1 13131 1726867192.70267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867192.70469: done with get_vars() 13131 1726867192.70480: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 17:19:52 -0400 (0:00:00.034) 0:00:07.816 ****** 13131 1726867192.70564: entering _queue_task() for managed_node1/include_tasks 13131 1726867192.70809: worker is 1 (out of 1 available) 13131 1726867192.70822: exiting _queue_task() for managed_node1/include_tasks 13131 1726867192.70834: done queuing things up, now waiting for results queue to drain 13131 1726867192.70835: waiting for pending results... 
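For readers following the log: the task being queued here lives at `assert_device_present.yml:3` and, per the subsequent "we have included files to process" records, pulls in `get_interface_stat.yml`. The actual file contents are not printed in this log, so the following is only a hedged reconstruction of what that include likely looks like:

```yaml
# Hypothetical sketch -- the real contents of
# tests/network/playbooks/tasks/assert_device_present.yml are not shown
# in this log; only the task name and path are.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: tasks/get_interface_stat.yml
```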
13131 1726867192.71091: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 13131 1726867192.71221: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000001b 13131 1726867192.71244: variable 'ansible_search_path' from source: unknown 13131 1726867192.71283: variable 'ansible_search_path' from source: unknown 13131 1726867192.71304: calling self._execute() 13131 1726867192.71393: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867192.71415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867192.71434: variable 'omit' from source: magic vars 13131 1726867192.71839: variable 'ansible_distribution_major_version' from source: facts 13131 1726867192.71843: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867192.71847: _execute() done 13131 1726867192.71849: dumping result to json 13131 1726867192.71851: done dumping result, returning 13131 1726867192.71865: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-5f24-9b7a-00000000001b] 13131 1726867192.71876: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000001b 13131 1726867192.72012: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000001b 13131 1726867192.72015: WORKER PROCESS EXITING 13131 1726867192.72044: no more pending results, returning what we have 13131 1726867192.72164: in VariableManager get_vars() 13131 1726867192.72212: Calling all_inventory to load vars for managed_node1 13131 1726867192.72215: Calling groups_inventory to load vars for managed_node1 13131 1726867192.72218: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867192.72227: Calling all_plugins_play to load vars for managed_node1 13131 1726867192.72230: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867192.72233: Calling groups_plugins_play to load vars for managed_node1 13131 
1726867192.72449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867192.72678: done with get_vars() 13131 1726867192.72685: variable 'ansible_search_path' from source: unknown 13131 1726867192.72687: variable 'ansible_search_path' from source: unknown 13131 1726867192.72728: we have included files to process 13131 1726867192.72729: generating all_blocks data 13131 1726867192.72730: done generating all_blocks data 13131 1726867192.72734: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13131 1726867192.72735: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13131 1726867192.72737: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13131 1726867192.72897: done processing included file 13131 1726867192.72898: iterating over new_blocks loaded from include file 13131 1726867192.72900: in VariableManager get_vars() 13131 1726867192.72929: done with get_vars() 13131 1726867192.72930: filtering new block on tags 13131 1726867192.72944: done filtering new block on tags 13131 1726867192.72946: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 13131 1726867192.72950: extending task lists for all hosts with included blocks 13131 1726867192.73043: done extending task lists 13131 1726867192.73044: done processing included files 13131 1726867192.73045: results queue empty 13131 1726867192.73045: checking for any_errors_fatal 13131 1726867192.73049: done checking for any_errors_fatal 13131 1726867192.73050: checking for max_fail_percentage 13131 1726867192.73051: done checking for 
max_fail_percentage 13131 1726867192.73052: checking to see if all hosts have failed and the running result is not ok 13131 1726867192.73053: done checking to see if all hosts have failed 13131 1726867192.73053: getting the remaining hosts for this loop 13131 1726867192.73054: done getting the remaining hosts for this loop 13131 1726867192.73056: getting the next task for host managed_node1 13131 1726867192.73060: done getting next task for host managed_node1 13131 1726867192.73061: ^ task is: TASK: Get stat for interface {{ interface }} 13131 1726867192.73064: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867192.73066: getting variables 13131 1726867192.73067: in VariableManager get_vars() 13131 1726867192.73084: Calling all_inventory to load vars for managed_node1 13131 1726867192.73086: Calling groups_inventory to load vars for managed_node1 13131 1726867192.73088: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867192.73092: Calling all_plugins_play to load vars for managed_node1 13131 1726867192.73094: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867192.73097: Calling groups_plugins_play to load vars for managed_node1 13131 1726867192.73234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867192.73426: done with get_vars() 13131 1726867192.73435: done getting variables 13131 1726867192.73588: variable 'interface' from source: task vars 13131 1726867192.73592: variable 'dhcp_interface2' from source: play vars 13131 1726867192.73647: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:19:52 -0400 (0:00:00.031) 0:00:07.847 ****** 13131 1726867192.73676: entering _queue_task() for managed_node1/stat 13131 1726867192.74082: worker is 1 (out of 1 available) 13131 1726867192.74090: exiting _queue_task() for managed_node1/stat 13131 1726867192.74099: done queuing things up, now waiting for results queue to drain 13131 1726867192.74100: waiting for pending results... 
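The "Get stat for interface test2" task queued here (at `get_interface_stat.yml:3`) runs the `stat` module against `/sys/class/net/{{ interface }}`. Judging from the `module_args` echoed in the module's JSON result later in this log (`path`, `get_attributes`, `get_checksum`, `get_mime` all visible there), the task probably resembles the sketch below; the `register` name is an assumption inferred from the earlier `interface_stat` variable lookups:

```yaml
# Sketch inferred from the module_args echoed in the stat result;
# not the verbatim file contents. 'interface_stat' is assumed from the
# 'interface_stat.stat.exists' conditional evaluated earlier in the log.
- name: Get stat for interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```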
13131 1726867192.74293: running TaskExecutor() for managed_node1/TASK: Get stat for interface test2 13131 1726867192.74298: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000260 13131 1726867192.74305: variable 'ansible_search_path' from source: unknown 13131 1726867192.74313: variable 'ansible_search_path' from source: unknown 13131 1726867192.74359: calling self._execute() 13131 1726867192.74448: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867192.74460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867192.74474: variable 'omit' from source: magic vars 13131 1726867192.74824: variable 'ansible_distribution_major_version' from source: facts 13131 1726867192.74842: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867192.74853: variable 'omit' from source: magic vars 13131 1726867192.74985: variable 'omit' from source: magic vars 13131 1726867192.75012: variable 'interface' from source: task vars 13131 1726867192.75022: variable 'dhcp_interface2' from source: play vars 13131 1726867192.75095: variable 'dhcp_interface2' from source: play vars 13131 1726867192.75118: variable 'omit' from source: magic vars 13131 1726867192.75160: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867192.75209: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867192.75231: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867192.75254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867192.75269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867192.75312: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 13131 1726867192.75321: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867192.75329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867192.75483: Set connection var ansible_connection to ssh 13131 1726867192.75487: Set connection var ansible_timeout to 10 13131 1726867192.75489: Set connection var ansible_shell_type to sh 13131 1726867192.75491: Set connection var ansible_shell_executable to /bin/sh 13131 1726867192.75493: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867192.75495: Set connection var ansible_pipelining to False 13131 1726867192.75582: variable 'ansible_shell_executable' from source: unknown 13131 1726867192.75592: variable 'ansible_connection' from source: unknown 13131 1726867192.75599: variable 'ansible_module_compression' from source: unknown 13131 1726867192.75605: variable 'ansible_shell_type' from source: unknown 13131 1726867192.75610: variable 'ansible_shell_executable' from source: unknown 13131 1726867192.75615: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867192.75738: variable 'ansible_pipelining' from source: unknown 13131 1726867192.75744: variable 'ansible_timeout' from source: unknown 13131 1726867192.75746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867192.75866: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867192.75885: variable 'omit' from source: magic vars 13131 1726867192.75895: starting attempt loop 13131 1726867192.75903: running the handler 13131 1726867192.75921: _low_level_execute_command(): starting 13131 1726867192.75934: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 
1726867192.76700: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867192.76722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867192.76730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867192.76737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867192.76830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867192.76833: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867192.76836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867192.76838: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867192.76840: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867192.76842: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867192.76933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867192.76938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867192.76998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867192.78648: stdout chunk (state=3): >>>/root <<< 13131 1726867192.78861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 
1726867192.78864: stdout chunk (state=3): >>><<< 13131 1726867192.78866: stderr chunk (state=3): >>><<< 13131 1726867192.78869: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867192.78872: _low_level_execute_command(): starting 13131 1726867192.78875: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867192.7882211-13639-49786339206919 `" && echo ansible-tmp-1726867192.7882211-13639-49786339206919="` echo /root/.ansible/tmp/ansible-tmp-1726867192.7882211-13639-49786339206919 `" ) && sleep 0' 13131 1726867192.79432: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867192.79443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
13131 1726867192.79446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867192.79448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867192.79451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867192.79453: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867192.79458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867192.79473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867192.79506: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867192.79509: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867192.79511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867192.79514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867192.79582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867192.79585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867192.79588: stderr chunk (state=3): >>>debug2: match found <<< 13131 1726867192.79590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867192.79612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867192.79624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867192.79631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867192.79706: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 13131 1726867192.81568: stdout chunk (state=3): >>>ansible-tmp-1726867192.7882211-13639-49786339206919=/root/.ansible/tmp/ansible-tmp-1726867192.7882211-13639-49786339206919 <<< 13131 1726867192.81724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867192.81727: stdout chunk (state=3): >>><<< 13131 1726867192.81729: stderr chunk (state=3): >>><<< 13131 1726867192.81882: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867192.7882211-13639-49786339206919=/root/.ansible/tmp/ansible-tmp-1726867192.7882211-13639-49786339206919 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867192.81885: variable 'ansible_module_compression' from source: unknown 13131 1726867192.81887: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13131 1726867192.81889: variable 'ansible_facts' from source: unknown 13131 1726867192.81975: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867192.7882211-13639-49786339206919/AnsiballZ_stat.py 13131 1726867192.82138: Sending initial data 13131 1726867192.82234: Sent initial data (152 bytes) 13131 1726867192.82829: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867192.82845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867192.82894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867192.82997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867192.83000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867192.83026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867192.83042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867192.83270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867192.84664: 
stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13131 1726867192.84675: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 13131 1726867192.84693: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 13131 1726867192.84711: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867192.84772: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867192.84847: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp8p1abw9h /root/.ansible/tmp/ansible-tmp-1726867192.7882211-13639-49786339206919/AnsiballZ_stat.py <<< 13131 1726867192.84871: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867192.7882211-13639-49786339206919/AnsiballZ_stat.py" <<< 13131 1726867192.84902: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp8p1abw9h" to remote "/root/.ansible/tmp/ansible-tmp-1726867192.7882211-13639-49786339206919/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867192.7882211-13639-49786339206919/AnsiballZ_stat.py" <<< 13131 1726867192.85756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867192.85760: stdout chunk (state=3): >>><<< 13131 1726867192.85762: stderr chunk (state=3): >>><<< 13131 1726867192.85772: done transferring module to remote 13131 1726867192.85786: _low_level_execute_command(): starting 13131 1726867192.85796: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867192.7882211-13639-49786339206919/ /root/.ansible/tmp/ansible-tmp-1726867192.7882211-13639-49786339206919/AnsiballZ_stat.py && sleep 0' 13131 1726867192.86497: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867192.86554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867192.86574: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867192.86598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867192.86685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867192.88462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867192.88466: stdout chunk (state=3): >>><<< 13131 1726867192.88468: stderr chunk (state=3): >>><<< 13131 1726867192.88486: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867192.88564: _low_level_execute_command(): starting 13131 1726867192.88567: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867192.7882211-13639-49786339206919/AnsiballZ_stat.py && sleep 0' 13131 1726867192.89081: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867192.89099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867192.89116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867192.89200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867192.89247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867192.89263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867192.89338: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13131 1726867193.04410: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28164, "dev": 23, "nlink": 1, "atime": 1726867190.9757478, "mtime": 1726867190.9757478, "ctime": 1726867190.9757478, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13131 1726867193.05712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867193.05724: stdout chunk (state=3): >>><<< 13131 1726867193.05747: stderr chunk (state=3): >>><<< 13131 1726867193.05772: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28164, "dev": 23, "nlink": 1, "atime": 1726867190.9757478, "mtime": 1726867190.9757478, "ctime": 1726867190.9757478, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867193.05846: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867192.7882211-13639-49786339206919/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867193.05860: _low_level_execute_command(): starting 13131 1726867193.05869: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867192.7882211-13639-49786339206919/ > /dev/null 2>&1 && sleep 0' 13131 1726867193.06496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867193.06512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867193.06532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867193.06593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867193.06597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867193.06663: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867193.06697: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867193.06715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867193.06737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867193.06845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867193.08702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867193.08705: stdout chunk (state=3): >>><<< 13131 1726867193.08708: stderr chunk (state=3): >>><<< 13131 1726867193.08889: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867193.08893: handler run complete 13131 1726867193.08895: attempt loop complete, returning result 13131 1726867193.08898: _execute() done 13131 1726867193.08900: dumping result to json 13131 1726867193.08902: done dumping result, returning 13131 1726867193.08904: done running TaskExecutor() for managed_node1/TASK: Get stat for interface test2 [0affcac9-a3a5-5f24-9b7a-000000000260] 13131 1726867193.08907: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000260 13131 1726867193.08983: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000260 13131 1726867193.08986: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726867190.9757478, "block_size": 4096, "blocks": 0, "ctime": 1726867190.9757478, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28164, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726867190.9757478, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13131 1726867193.09096: no more 
pending results, returning what we have 13131 1726867193.09108: results queue empty 13131 1726867193.09110: checking for any_errors_fatal 13131 1726867193.09112: done checking for any_errors_fatal 13131 1726867193.09113: checking for max_fail_percentage 13131 1726867193.09115: done checking for max_fail_percentage 13131 1726867193.09116: checking to see if all hosts have failed and the running result is not ok 13131 1726867193.09116: done checking to see if all hosts have failed 13131 1726867193.09117: getting the remaining hosts for this loop 13131 1726867193.09118: done getting the remaining hosts for this loop 13131 1726867193.09122: getting the next task for host managed_node1 13131 1726867193.09131: done getting next task for host managed_node1 13131 1726867193.09133: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13131 1726867193.09137: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867193.09141: getting variables 13131 1726867193.09142: in VariableManager get_vars() 13131 1726867193.09347: Calling all_inventory to load vars for managed_node1 13131 1726867193.09350: Calling groups_inventory to load vars for managed_node1 13131 1726867193.09353: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867193.09365: Calling all_plugins_play to load vars for managed_node1 13131 1726867193.09368: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867193.09371: Calling groups_plugins_play to load vars for managed_node1 13131 1726867193.09904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867193.10128: done with get_vars() 13131 1726867193.10139: done getting variables 13131 1726867193.10203: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867193.10324: variable 'interface' from source: task vars 13131 1726867193.10333: variable 'dhcp_interface2' from source: play vars 13131 1726867193.10397: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 17:19:53 -0400 (0:00:00.367) 0:00:08.214 ****** 13131 1726867193.10427: entering _queue_task() for managed_node1/assert 13131 1726867193.10746: worker is 1 (out of 1 available) 13131 1726867193.10757: exiting _queue_task() for managed_node1/assert 13131 1726867193.10836: done queuing things up, now waiting for results queue to drain 13131 1726867193.10838: waiting for pending results... 
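For readers following the trace: the `stat` result dumped above for `/sys/class/net/test2` corresponds to a task along these lines. This is a sketch reconstructed from the logged `module_args`, not the verbatim task file; the `register: interface_stat` wiring is an assumption (the trace later reports `interface_stat` from `set_fact`):

```yaml
# Sketch reconstructed from the logged module_args -- not the verbatim task file.
- name: Get stat for interface test2
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"   # resolves to /sys/class/net/test2 here
    get_attributes: false
    get_checksum: false
    get_mime: false
    follow: false
  register: interface_stat                   # assumed wiring; the trace shows interface_stat from set_fact
```

Note that sysfs entries under `/sys/class/net/` are symlinks into `/sys/devices/`, which is why the result above reports `"islnk": true` with `"lnk_target": "../../devices/virtual/net/test2"`.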
13131 1726867193.11072: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test2' 13131 1726867193.11166: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000001c 13131 1726867193.11173: variable 'ansible_search_path' from source: unknown 13131 1726867193.11175: variable 'ansible_search_path' from source: unknown 13131 1726867193.11184: calling self._execute() 13131 1726867193.11383: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867193.11389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867193.11391: variable 'omit' from source: magic vars 13131 1726867193.11683: variable 'ansible_distribution_major_version' from source: facts 13131 1726867193.11700: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867193.11723: variable 'omit' from source: magic vars 13131 1726867193.11769: variable 'omit' from source: magic vars 13131 1726867193.11884: variable 'interface' from source: task vars 13131 1726867193.11897: variable 'dhcp_interface2' from source: play vars 13131 1726867193.11976: variable 'dhcp_interface2' from source: play vars 13131 1726867193.12002: variable 'omit' from source: magic vars 13131 1726867193.12052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867193.12090: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867193.12184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867193.12188: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867193.12190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867193.12370: variable 'inventory_hostname' from source: host 
vars for 'managed_node1' 13131 1726867193.12375: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867193.12379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867193.12516: Set connection var ansible_connection to ssh 13131 1726867193.12531: Set connection var ansible_timeout to 10 13131 1726867193.12698: Set connection var ansible_shell_type to sh 13131 1726867193.12702: Set connection var ansible_shell_executable to /bin/sh 13131 1726867193.12704: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867193.12707: Set connection var ansible_pipelining to False 13131 1726867193.12709: variable 'ansible_shell_executable' from source: unknown 13131 1726867193.12711: variable 'ansible_connection' from source: unknown 13131 1726867193.12713: variable 'ansible_module_compression' from source: unknown 13131 1726867193.12715: variable 'ansible_shell_type' from source: unknown 13131 1726867193.12716: variable 'ansible_shell_executable' from source: unknown 13131 1726867193.12724: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867193.12725: variable 'ansible_pipelining' from source: unknown 13131 1726867193.12728: variable 'ansible_timeout' from source: unknown 13131 1726867193.12730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867193.13017: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867193.13261: variable 'omit' from source: magic vars 13131 1726867193.13263: starting attempt loop 13131 1726867193.13265: running the handler 13131 1726867193.13405: variable 'interface_stat' from source: set_fact 13131 1726867193.13493: Evaluated conditional 
(interface_stat.stat.exists): True 13131 1726867193.13585: handler run complete 13131 1726867193.13684: attempt loop complete, returning result 13131 1726867193.13689: _execute() done 13131 1726867193.13692: dumping result to json 13131 1726867193.13694: done dumping result, returning 13131 1726867193.13697: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test2' [0affcac9-a3a5-5f24-9b7a-00000000001c] 13131 1726867193.13699: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000001c 13131 1726867193.13763: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000001c 13131 1726867193.13766: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 13131 1726867193.13822: no more pending results, returning what we have 13131 1726867193.13825: results queue empty 13131 1726867193.13826: checking for any_errors_fatal 13131 1726867193.13835: done checking for any_errors_fatal 13131 1726867193.13836: checking for max_fail_percentage 13131 1726867193.13837: done checking for max_fail_percentage 13131 1726867193.13838: checking to see if all hosts have failed and the running result is not ok 13131 1726867193.13839: done checking to see if all hosts have failed 13131 1726867193.13840: getting the remaining hosts for this loop 13131 1726867193.13841: done getting the remaining hosts for this loop 13131 1726867193.13844: getting the next task for host managed_node1 13131 1726867193.13852: done getting next task for host managed_node1 13131 1726867193.13855: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 13131 1726867193.13857: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867193.13860: getting variables 13131 1726867193.13862: in VariableManager get_vars() 13131 1726867193.13918: Calling all_inventory to load vars for managed_node1 13131 1726867193.13921: Calling groups_inventory to load vars for managed_node1 13131 1726867193.13924: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867193.13936: Calling all_plugins_play to load vars for managed_node1 13131 1726867193.13939: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867193.13942: Calling groups_plugins_play to load vars for managed_node1 13131 1726867193.14591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867193.15095: done with get_vars() 13131 1726867193.15107: done getting variables 13131 1726867193.15163: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:28 Friday 20 September 2024 17:19:53 -0400 (0:00:00.048) 0:00:08.263 ****** 13131 1726867193.15308: entering _queue_task() for managed_node1/command 13131 1726867193.15851: worker is 1 (out of 1 available) 13131 1726867193.15861: exiting _queue_task() for managed_node1/command 13131 1726867193.15872: done queuing things up, now waiting for results queue to drain 13131 1726867193.15873: waiting for pending results... 
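The assertion evaluated just above (`Evaluated conditional (interface_stat.stat.exists): True`, followed by `MSG: All assertions passed`) maps to an `assert` task of roughly this shape. A sketch only: the actual body lives in `assert_device_present.yml` and is not reproduced in the trace:

```yaml
# Sketch of the assertion the trace evaluates; the task file itself is not in the log.
- name: "Assert that the interface is present - '{{ interface }}'"
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists
```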
13131 1726867193.16285: running TaskExecutor() for managed_node1/TASK: Backup the /etc/resolv.conf for initscript 13131 1726867193.16394: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000001d 13131 1726867193.16422: variable 'ansible_search_path' from source: unknown 13131 1726867193.16561: calling self._execute() 13131 1726867193.16679: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867193.16744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867193.16758: variable 'omit' from source: magic vars 13131 1726867193.17176: variable 'ansible_distribution_major_version' from source: facts 13131 1726867193.17197: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867193.17306: variable 'network_provider' from source: set_fact 13131 1726867193.17317: Evaluated conditional (network_provider == "initscripts"): False 13131 1726867193.17324: when evaluation is False, skipping this task 13131 1726867193.17382: _execute() done 13131 1726867193.17385: dumping result to json 13131 1726867193.17396: done dumping result, returning 13131 1726867193.17399: done running TaskExecutor() for managed_node1/TASK: Backup the /etc/resolv.conf for initscript [0affcac9-a3a5-5f24-9b7a-00000000001d] 13131 1726867193.17401: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000001d 13131 1726867193.17583: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000001d 13131 1726867193.17586: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13131 1726867193.17619: no more pending results, returning what we have 13131 1726867193.17622: results queue empty 13131 1726867193.17622: checking for any_errors_fatal 13131 1726867193.17627: done checking for any_errors_fatal 13131 1726867193.17628: checking for max_fail_percentage 13131 1726867193.17630: done checking 
for max_fail_percentage 13131 1726867193.17630: checking to see if all hosts have failed and the running result is not ok 13131 1726867193.17631: done checking to see if all hosts have failed 13131 1726867193.17632: getting the remaining hosts for this loop 13131 1726867193.17633: done getting the remaining hosts for this loop 13131 1726867193.17636: getting the next task for host managed_node1 13131 1726867193.17641: done getting next task for host managed_node1 13131 1726867193.17643: ^ task is: TASK: TEST Add Bond with 2 ports 13131 1726867193.17646: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867193.17648: getting variables 13131 1726867193.17650: in VariableManager get_vars() 13131 1726867193.17698: Calling all_inventory to load vars for managed_node1 13131 1726867193.17700: Calling groups_inventory to load vars for managed_node1 13131 1726867193.17703: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867193.17711: Calling all_plugins_play to load vars for managed_node1 13131 1726867193.17714: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867193.17717: Calling groups_plugins_play to load vars for managed_node1 13131 1726867193.17933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867193.18129: done with get_vars() 13131 1726867193.18137: done getting variables 13131 1726867193.18191: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[TEST Add Bond with 2 ports] ********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:33 Friday 20 September 2024 17:19:53 -0400 (0:00:00.029) 0:00:08.292 ****** 13131 1726867193.18217: entering _queue_task() for managed_node1/debug 13131 1726867193.18560: worker is 1 (out of 1 available) 13131 1726867193.18570: exiting _queue_task() for managed_node1/debug 13131 1726867193.18585: done queuing things up, now waiting for results queue to drain 13131 1726867193.18586: waiting for pending results... 13131 1726867193.18727: running TaskExecutor() for managed_node1/TASK: TEST Add Bond with 2 ports 13131 1726867193.18833: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000001e 13131 1726867193.18855: variable 'ansible_search_path' from source: unknown 13131 1726867193.18902: calling self._execute() 13131 1726867193.19005: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867193.19030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867193.19082: variable 'omit' from source: magic vars 13131 1726867193.19425: variable 'ansible_distribution_major_version' from source: facts 13131 1726867193.19442: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867193.19462: variable 'omit' from source: magic vars 13131 1726867193.19488: variable 'omit' from source: magic vars 13131 1726867193.19534: variable 'omit' from source: magic vars 13131 1726867193.19583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867193.19644: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867193.19647: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867193.19670: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867193.19695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867193.19753: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867193.19756: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867193.19759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867193.19867: Set connection var ansible_connection to ssh 13131 1726867193.19972: Set connection var ansible_timeout to 10 13131 1726867193.19976: Set connection var ansible_shell_type to sh 13131 1726867193.19982: Set connection var ansible_shell_executable to /bin/sh 13131 1726867193.19984: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867193.19988: Set connection var ansible_pipelining to False 13131 1726867193.19990: variable 'ansible_shell_executable' from source: unknown 13131 1726867193.19992: variable 'ansible_connection' from source: unknown 13131 1726867193.19994: variable 'ansible_module_compression' from source: unknown 13131 1726867193.19997: variable 'ansible_shell_type' from source: unknown 13131 1726867193.20000: variable 'ansible_shell_executable' from source: unknown 13131 1726867193.20001: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867193.20004: variable 'ansible_pipelining' from source: unknown 13131 1726867193.20006: variable 'ansible_timeout' from source: unknown 13131 1726867193.20008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867193.20192: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867193.20196: variable 'omit' from source: magic vars 13131 1726867193.20199: starting attempt loop 13131 1726867193.20201: running the handler 13131 1726867193.20283: handler run complete 13131 1726867193.20287: attempt loop complete, returning result 13131 1726867193.20289: _execute() done 13131 1726867193.20292: dumping result to json 13131 1726867193.20299: done dumping result, returning 13131 1726867193.20302: done running TaskExecutor() for managed_node1/TASK: TEST Add Bond with 2 ports [0affcac9-a3a5-5f24-9b7a-00000000001e] 13131 1726867193.20309: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000001e ok: [managed_node1] => {} MSG: ################################################## 13131 1726867193.20573: no more pending results, returning what we have 13131 1726867193.20576: results queue empty 13131 1726867193.20583: checking for any_errors_fatal 13131 1726867193.20591: done checking for any_errors_fatal 13131 1726867193.20592: checking for max_fail_percentage 13131 1726867193.20594: done checking for max_fail_percentage 13131 1726867193.20594: checking to see if all hosts have failed and the running result is not ok 13131 1726867193.20595: done checking to see if all hosts have failed 13131 1726867193.20596: getting the remaining hosts for this loop 13131 1726867193.20597: done getting the remaining hosts for this loop 13131 1726867193.20600: getting the next task for host managed_node1 13131 1726867193.20607: done getting next task for host managed_node1 13131 1726867193.20613: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13131 1726867193.20617: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867193.20699: getting variables 13131 1726867193.20701: in VariableManager get_vars() 13131 1726867193.20831: Calling all_inventory to load vars for managed_node1 13131 1726867193.20834: Calling groups_inventory to load vars for managed_node1 13131 1726867193.20842: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867193.20848: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000001e 13131 1726867193.20850: WORKER PROCESS EXITING 13131 1726867193.20859: Calling all_plugins_play to load vars for managed_node1 13131 1726867193.20861: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867193.20864: Calling groups_plugins_play to load vars for managed_node1 13131 1726867193.21045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867193.21523: done with get_vars() 13131 1726867193.21533: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:19:53 -0400 (0:00:00.034) 0:00:08.326 ****** 13131 1726867193.21629: entering _queue_task() for managed_node1/include_tasks 13131 1726867193.21859: worker is 1 (out of 1 available) 13131 1726867193.21871: exiting _queue_task() for managed_node1/include_tasks 13131 1726867193.22089: done queuing things up, now waiting for results queue to drain 13131 1726867193.22091: waiting for pending results... 
13131 1726867193.22126: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13131 1726867193.22259: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000026 13131 1726867193.22303: variable 'ansible_search_path' from source: unknown 13131 1726867193.22309: variable 'ansible_search_path' from source: unknown 13131 1726867193.22329: calling self._execute() 13131 1726867193.22412: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867193.22483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867193.22486: variable 'omit' from source: magic vars 13131 1726867193.22819: variable 'ansible_distribution_major_version' from source: facts 13131 1726867193.22837: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867193.22858: _execute() done 13131 1726867193.22867: dumping result to json 13131 1726867193.22875: done dumping result, returning 13131 1726867193.22889: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-5f24-9b7a-000000000026] 13131 1726867193.22900: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000026 13131 1726867193.23144: no more pending results, returning what we have 13131 1726867193.23149: in VariableManager get_vars() 13131 1726867193.23211: Calling all_inventory to load vars for managed_node1 13131 1726867193.23216: Calling groups_inventory to load vars for managed_node1 13131 1726867193.23219: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867193.23230: Calling all_plugins_play to load vars for managed_node1 13131 1726867193.23234: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867193.23238: Calling groups_plugins_play to load vars for managed_node1 13131 1726867193.23492: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000026 13131 
1726867193.23502: WORKER PROCESS EXITING 13131 1726867193.23524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867193.23728: done with get_vars() 13131 1726867193.23735: variable 'ansible_search_path' from source: unknown 13131 1726867193.23736: variable 'ansible_search_path' from source: unknown 13131 1726867193.23767: we have included files to process 13131 1726867193.23768: generating all_blocks data 13131 1726867193.23769: done generating all_blocks data 13131 1726867193.23775: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13131 1726867193.23776: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13131 1726867193.23780: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13131 1726867193.24513: done processing included file 13131 1726867193.24515: iterating over new_blocks loaded from include file 13131 1726867193.24516: in VariableManager get_vars() 13131 1726867193.24545: done with get_vars() 13131 1726867193.24547: filtering new block on tags 13131 1726867193.24563: done filtering new block on tags 13131 1726867193.24566: in VariableManager get_vars() 13131 1726867193.24603: done with get_vars() 13131 1726867193.24605: filtering new block on tags 13131 1726867193.24625: done filtering new block on tags 13131 1726867193.24627: in VariableManager get_vars() 13131 1726867193.24655: done with get_vars() 13131 1726867193.24656: filtering new block on tags 13131 1726867193.24674: done filtering new block on tags 13131 1726867193.24676: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 13131 1726867193.24683: extending task lists for all hosts 
with included blocks 13131 1726867193.25564: done extending task lists 13131 1726867193.25566: done processing included files 13131 1726867193.25567: results queue empty 13131 1726867193.25575: checking for any_errors_fatal 13131 1726867193.25580: done checking for any_errors_fatal 13131 1726867193.25581: checking for max_fail_percentage 13131 1726867193.25582: done checking for max_fail_percentage 13131 1726867193.25583: checking to see if all hosts have failed and the running result is not ok 13131 1726867193.25584: done checking to see if all hosts have failed 13131 1726867193.25585: getting the remaining hosts for this loop 13131 1726867193.25586: done getting the remaining hosts for this loop 13131 1726867193.25588: getting the next task for host managed_node1 13131 1726867193.25592: done getting next task for host managed_node1 13131 1726867193.25595: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13131 1726867193.25598: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867193.25607: getting variables 13131 1726867193.25608: in VariableManager get_vars() 13131 1726867193.25628: Calling all_inventory to load vars for managed_node1 13131 1726867193.25630: Calling groups_inventory to load vars for managed_node1 13131 1726867193.25632: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867193.25637: Calling all_plugins_play to load vars for managed_node1 13131 1726867193.25640: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867193.25643: Calling groups_plugins_play to load vars for managed_node1 13131 1726867193.25814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867193.26018: done with get_vars() 13131 1726867193.26027: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:19:53 -0400 (0:00:00.044) 0:00:08.371 ****** 13131 1726867193.26100: entering _queue_task() for managed_node1/setup 13131 1726867193.26461: worker is 1 (out of 1 available) 13131 1726867193.26473: exiting _queue_task() for managed_node1/setup 13131 1726867193.26487: done queuing things up, now waiting for results queue to drain 13131 1726867193.26489: waiting for pending results... 
13131 1726867193.26705: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13131 1726867193.26863: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000027e 13131 1726867193.26984: variable 'ansible_search_path' from source: unknown 13131 1726867193.26990: variable 'ansible_search_path' from source: unknown 13131 1726867193.26993: calling self._execute() 13131 1726867193.27038: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867193.27081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867193.27084: variable 'omit' from source: magic vars 13131 1726867193.27420: variable 'ansible_distribution_major_version' from source: facts 13131 1726867193.27443: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867193.27643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867193.29802: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867193.29892: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867193.30082: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867193.30086: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867193.30088: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867193.30091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867193.30119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867193.30150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867193.30198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867193.30229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867193.30288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867193.30327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867193.30357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867193.30403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867193.30433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867193.30600: variable '__network_required_facts' from source: role 
'' defaults 13131 1726867193.30614: variable 'ansible_facts' from source: unknown 13131 1726867193.30717: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13131 1726867193.30726: when evaluation is False, skipping this task 13131 1726867193.30734: _execute() done 13131 1726867193.30758: dumping result to json 13131 1726867193.30761: done dumping result, returning 13131 1726867193.30783: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-5f24-9b7a-00000000027e] 13131 1726867193.30786: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000027e 13131 1726867193.31013: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000027e 13131 1726867193.31016: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867193.31062: no more pending results, returning what we have 13131 1726867193.31066: results queue empty 13131 1726867193.31067: checking for any_errors_fatal 13131 1726867193.31068: done checking for any_errors_fatal 13131 1726867193.31069: checking for max_fail_percentage 13131 1726867193.31071: done checking for max_fail_percentage 13131 1726867193.31071: checking to see if all hosts have failed and the running result is not ok 13131 1726867193.31072: done checking to see if all hosts have failed 13131 1726867193.31073: getting the remaining hosts for this loop 13131 1726867193.31074: done getting the remaining hosts for this loop 13131 1726867193.31080: getting the next task for host managed_node1 13131 1726867193.31097: done getting next task for host managed_node1 13131 1726867193.31101: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13131 1726867193.31105: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867193.31119: getting variables 13131 1726867193.31122: in VariableManager get_vars() 13131 1726867193.31313: Calling all_inventory to load vars for managed_node1 13131 1726867193.31316: Calling groups_inventory to load vars for managed_node1 13131 1726867193.31319: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867193.31328: Calling all_plugins_play to load vars for managed_node1 13131 1726867193.31331: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867193.31334: Calling groups_plugins_play to load vars for managed_node1 13131 1726867193.31587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867193.31816: done with get_vars() 13131 1726867193.31827: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:19:53 -0400 (0:00:00.058) 0:00:08.429 ****** 13131 1726867193.31938: entering _queue_task() for managed_node1/stat 13131 1726867193.32242: worker is 1 (out of 1 
available) 13131 1726867193.32252: exiting _queue_task() for managed_node1/stat 13131 1726867193.32262: done queuing things up, now waiting for results queue to drain 13131 1726867193.32263: waiting for pending results... 13131 1726867193.32518: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 13131 1726867193.32620: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000280 13131 1726867193.32637: variable 'ansible_search_path' from source: unknown 13131 1726867193.32643: variable 'ansible_search_path' from source: unknown 13131 1726867193.32685: calling self._execute() 13131 1726867193.32765: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867193.32784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867193.32795: variable 'omit' from source: magic vars 13131 1726867193.33211: variable 'ansible_distribution_major_version' from source: facts 13131 1726867193.33214: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867193.33361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867193.33710: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867193.33763: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867193.33802: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867193.33843: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867193.33969: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867193.33973: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867193.33996: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867193.34031: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867193.34119: variable '__network_is_ostree' from source: set_fact 13131 1726867193.34184: Evaluated conditional (not __network_is_ostree is defined): False 13131 1726867193.34188: when evaluation is False, skipping this task 13131 1726867193.34191: _execute() done 13131 1726867193.34193: dumping result to json 13131 1726867193.34195: done dumping result, returning 13131 1726867193.34198: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-5f24-9b7a-000000000280] 13131 1726867193.34200: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000280 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13131 1726867193.34466: no more pending results, returning what we have 13131 1726867193.34469: results queue empty 13131 1726867193.34470: checking for any_errors_fatal 13131 1726867193.34474: done checking for any_errors_fatal 13131 1726867193.34475: checking for max_fail_percentage 13131 1726867193.34478: done checking for max_fail_percentage 13131 1726867193.34479: checking to see if all hosts have failed and the running result is not ok 13131 1726867193.34480: done checking to see if all hosts have failed 13131 1726867193.34480: getting the remaining hosts for this loop 13131 
1726867193.34482: done getting the remaining hosts for this loop 13131 1726867193.34485: getting the next task for host managed_node1 13131 1726867193.34491: done getting next task for host managed_node1 13131 1726867193.34495: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13131 1726867193.34499: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867193.34513: getting variables 13131 1726867193.34514: in VariableManager get_vars() 13131 1726867193.34558: Calling all_inventory to load vars for managed_node1 13131 1726867193.34561: Calling groups_inventory to load vars for managed_node1 13131 1726867193.34563: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867193.34572: Calling all_plugins_play to load vars for managed_node1 13131 1726867193.34575: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867193.34655: Calling groups_plugins_play to load vars for managed_node1 13131 1726867193.34667: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000280 13131 1726867193.34670: WORKER PROCESS EXITING 13131 1726867193.34887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867193.35090: done with get_vars() 13131 1726867193.35100: done getting variables 13131 1726867193.35163: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:19:53 -0400 (0:00:00.032) 0:00:08.462 ****** 13131 1726867193.35199: entering _queue_task() for managed_node1/set_fact 13131 1726867193.35438: worker is 1 (out of 1 available) 13131 1726867193.35566: exiting _queue_task() for managed_node1/set_fact 13131 1726867193.35576: done queuing things up, now waiting for results queue to drain 13131 1726867193.35579: waiting for pending results... 
13131 1726867193.35732: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13131 1726867193.35891: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000281 13131 1726867193.36009: variable 'ansible_search_path' from source: unknown 13131 1726867193.36013: variable 'ansible_search_path' from source: unknown 13131 1726867193.36017: calling self._execute() 13131 1726867193.36057: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867193.36070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867193.36087: variable 'omit' from source: magic vars 13131 1726867193.36479: variable 'ansible_distribution_major_version' from source: facts 13131 1726867193.36497: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867193.36663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867193.36985: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867193.37003: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867193.37043: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867193.37082: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867193.37176: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867193.37283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867193.37287: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867193.37290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867193.37373: variable '__network_is_ostree' from source: set_fact 13131 1726867193.37386: Evaluated conditional (not __network_is_ostree is defined): False 13131 1726867193.37393: when evaluation is False, skipping this task 13131 1726867193.37403: _execute() done 13131 1726867193.37409: dumping result to json 13131 1726867193.37419: done dumping result, returning 13131 1726867193.37428: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-5f24-9b7a-000000000281] 13131 1726867193.37513: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000281 13131 1726867193.37572: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000281 13131 1726867193.37574: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13131 1726867193.37659: no more pending results, returning what we have 13131 1726867193.37661: results queue empty 13131 1726867193.37662: checking for any_errors_fatal 13131 1726867193.37667: done checking for any_errors_fatal 13131 1726867193.37668: checking for max_fail_percentage 13131 1726867193.37669: done checking for max_fail_percentage 13131 1726867193.37670: checking to see if all hosts have failed and the running result is not ok 13131 1726867193.37671: done checking to see if all hosts have failed 13131 1726867193.37672: getting the remaining hosts for this loop 13131 1726867193.37673: done getting the remaining hosts for this loop 
13131 1726867193.37676: getting the next task for host managed_node1 13131 1726867193.37808: done getting next task for host managed_node1 13131 1726867193.37812: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13131 1726867193.37816: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867193.37827: getting variables 13131 1726867193.37829: in VariableManager get_vars() 13131 1726867193.37867: Calling all_inventory to load vars for managed_node1 13131 1726867193.37870: Calling groups_inventory to load vars for managed_node1 13131 1726867193.37872: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867193.37882: Calling all_plugins_play to load vars for managed_node1 13131 1726867193.37885: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867193.37888: Calling groups_plugins_play to load vars for managed_node1 13131 1726867193.38045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867193.38266: done with get_vars() 13131 1726867193.38275: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:19:53 -0400 (0:00:00.031) 0:00:08.494 ****** 13131 1726867193.38371: entering _queue_task() for managed_node1/service_facts 13131 1726867193.38373: Creating lock for service_facts 13131 1726867193.38607: worker is 1 (out of 1 available) 13131 1726867193.38619: exiting _queue_task() for managed_node1/service_facts 13131 1726867193.38745: done queuing things up, now waiting for results queue to drain 13131 1726867193.38747: waiting for pending results... 
13131 1726867193.38979: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 13131 1726867193.39075: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000283 13131 1726867193.39081: variable 'ansible_search_path' from source: unknown 13131 1726867193.39085: variable 'ansible_search_path' from source: unknown 13131 1726867193.39106: calling self._execute() 13131 1726867193.39189: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867193.39204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867193.39214: variable 'omit' from source: magic vars 13131 1726867193.39574: variable 'ansible_distribution_major_version' from source: facts 13131 1726867193.39619: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867193.39622: variable 'omit' from source: magic vars 13131 1726867193.39687: variable 'omit' from source: magic vars 13131 1726867193.39732: variable 'omit' from source: magic vars 13131 1726867193.39776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867193.39837: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867193.39841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867193.39866: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867193.39946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867193.39949: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867193.39954: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867193.39956: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 13131 1726867193.40038: Set connection var ansible_connection to ssh 13131 1726867193.40059: Set connection var ansible_timeout to 10 13131 1726867193.40071: Set connection var ansible_shell_type to sh 13131 1726867193.40087: Set connection var ansible_shell_executable to /bin/sh 13131 1726867193.40103: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867193.40114: Set connection var ansible_pipelining to False 13131 1726867193.40137: variable 'ansible_shell_executable' from source: unknown 13131 1726867193.40163: variable 'ansible_connection' from source: unknown 13131 1726867193.40166: variable 'ansible_module_compression' from source: unknown 13131 1726867193.40168: variable 'ansible_shell_type' from source: unknown 13131 1726867193.40170: variable 'ansible_shell_executable' from source: unknown 13131 1726867193.40171: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867193.40272: variable 'ansible_pipelining' from source: unknown 13131 1726867193.40276: variable 'ansible_timeout' from source: unknown 13131 1726867193.40280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867193.40395: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867193.40416: variable 'omit' from source: magic vars 13131 1726867193.40427: starting attempt loop 13131 1726867193.40434: running the handler 13131 1726867193.40491: _low_level_execute_command(): starting 13131 1726867193.40494: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867193.41381: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867193.41436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867193.41456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867193.41487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867193.41590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867193.43273: stdout chunk (state=3): >>>/root <<< 13131 1726867193.43573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867193.43579: stdout chunk (state=3): >>><<< 13131 1726867193.43581: stderr chunk (state=3): >>><<< 13131 1726867193.43603: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867193.43703: _low_level_execute_command(): starting 13131 1726867193.43707: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867193.4360938-13676-264471634798997 `" && echo ansible-tmp-1726867193.4360938-13676-264471634798997="` echo /root/.ansible/tmp/ansible-tmp-1726867193.4360938-13676-264471634798997 `" ) && sleep 0' 13131 1726867193.44237: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867193.44242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867193.44245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867193.44247: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 13131 1726867193.44256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867193.44302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867193.44393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867193.44472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867193.46367: stdout chunk (state=3): >>>ansible-tmp-1726867193.4360938-13676-264471634798997=/root/.ansible/tmp/ansible-tmp-1726867193.4360938-13676-264471634798997 <<< 13131 1726867193.46531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867193.46534: stdout chunk (state=3): >>><<< 13131 1726867193.46537: stderr chunk (state=3): >>><<< 13131 1726867193.46555: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867193.4360938-13676-264471634798997=/root/.ansible/tmp/ansible-tmp-1726867193.4360938-13676-264471634798997 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867193.46688: variable 'ansible_module_compression' from source: unknown 13131 1726867193.46692: ANSIBALLZ: Using lock for service_facts 13131 1726867193.46694: ANSIBALLZ: Acquiring lock 13131 1726867193.46696: ANSIBALLZ: Lock acquired: 140192900425840 13131 1726867193.46698: ANSIBALLZ: Creating module 13131 1726867193.60720: ANSIBALLZ: Writing module into payload 13131 1726867193.60818: ANSIBALLZ: Writing module 13131 1726867193.60846: ANSIBALLZ: Renaming module 13131 1726867193.60849: ANSIBALLZ: Done creating module 13131 1726867193.60868: variable 'ansible_facts' from source: unknown 13131 1726867193.61144: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867193.4360938-13676-264471634798997/AnsiballZ_service_facts.py 13131 1726867193.61623: Sending initial data 13131 1726867193.61627: Sent initial data (162 bytes) 13131 1726867193.62801: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867193.62808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867193.62840: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867193.62844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867193.62857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867193.62933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867193.62936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867193.63024: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867193.63119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867193.64786: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867193.64824: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867193.64879: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpn2lcje3c /root/.ansible/tmp/ansible-tmp-1726867193.4360938-13676-264471634798997/AnsiballZ_service_facts.py <<< 13131 1726867193.64882: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867193.4360938-13676-264471634798997/AnsiballZ_service_facts.py" <<< 13131 1726867193.64922: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpn2lcje3c" to remote "/root/.ansible/tmp/ansible-tmp-1726867193.4360938-13676-264471634798997/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867193.4360938-13676-264471634798997/AnsiballZ_service_facts.py" <<< 13131 1726867193.66466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867193.66534: stderr chunk (state=3): >>><<< 13131 1726867193.66544: stdout chunk (state=3): >>><<< 13131 1726867193.66650: done transferring module to remote 13131 1726867193.66666: _low_level_execute_command(): starting 13131 1726867193.66747: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867193.4360938-13676-264471634798997/ /root/.ansible/tmp/ansible-tmp-1726867193.4360938-13676-264471634798997/AnsiballZ_service_facts.py && sleep 0' 13131 1726867193.67309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867193.67329: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass <<< 13131 1726867193.67396: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867193.67453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867193.67456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867193.67480: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867193.67541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867193.69819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867193.69823: stdout chunk (state=3): >>><<< 13131 1726867193.69825: stderr chunk (state=3): >>><<< 13131 1726867193.69827: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867193.69829: _low_level_execute_command(): starting 13131 1726867193.69830: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867193.4360938-13676-264471634798997/AnsiballZ_service_facts.py && sleep 0' 13131 1726867193.70340: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867193.70352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867193.70364: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867193.70375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867193.70473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 13131 1726867193.70523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867195.24040: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": 
"nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13131 1726867195.26088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867195.26092: stdout chunk (state=3): >>><<< 13131 1726867195.26095: stderr chunk (state=3): >>><<< 13131 1726867195.26099: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": 
"rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": 
"blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", 
"state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, 
"sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
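The `service_facts` payload that ends above is a flat JSON mapping of unit name to `state`/`status`/`source`. As a minimal sketch of how such a mapping can be post-processed (the sample data and the helper name `units_with_status` are illustrative, not part of the module's API; the dict excerpt is hypothetical, not the full payload above):

```python
# Sketch: filter a service_facts-style mapping by reported status.
# Sample data is a hypothetical excerpt in the same shape as the log output.
services = {
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "getty@.service": {"name": "getty@.service", "state": "unknown",
                       "status": "enabled", "source": "systemd"},
    "polkit.service": {"name": "polkit.service", "state": "inactive",
                       "status": "static", "source": "systemd"},
}

def units_with_status(facts, status):
    """Return sorted unit names whose reported status matches."""
    return sorted(name for name, info in facts.items()
                  if info.get("status") == status)

print(units_with_status(services, "enabled"))  # -> ['getty@.service']
```

In a playbook the same filter is usually expressed with Jinja2 (`dict2items` plus `selectattr`) against `ansible_facts.services`.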
13131 1726867195.27795: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867193.4360938-13676-264471634798997/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867195.27801: _low_level_execute_command(): starting 13131 1726867195.27803: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867193.4360938-13676-264471634798997/ > /dev/null 2>&1 && sleep 0' 13131 1726867195.29002: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867195.29066: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867195.29082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867195.29094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867195.29110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867195.29118: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867195.29128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867195.29144: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867195.29252: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867195.29516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867195.29713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867195.31567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867195.31674: stderr chunk (state=3): >>><<< 13131 1726867195.31679: stdout chunk (state=3): >>><<< 13131 1726867195.31703: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867195.31709: handler run complete 13131 1726867195.32131: variable 'ansible_facts' from source: unknown 13131 1726867195.32441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867195.33358: variable 'ansible_facts' from source: unknown 13131 1726867195.35786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867195.36182: attempt loop complete, returning result 13131 1726867195.36299: _execute() done 13131 1726867195.36302: dumping result to json 13131 1726867195.36363: done dumping result, returning 13131 1726867195.36370: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-5f24-9b7a-000000000283] 13131 1726867195.36373: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000283 13131 1726867195.37897: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000283 13131 1726867195.37900: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867195.38038: no more pending results, returning what we have 13131 1726867195.38040: results queue empty 13131 1726867195.38041: checking for any_errors_fatal 13131 1726867195.38043: done checking for any_errors_fatal 13131 1726867195.38044: checking for max_fail_percentage 13131 1726867195.38045: done checking for max_fail_percentage 13131 1726867195.38046: checking to see if all hosts have failed and the running result is not ok 13131 1726867195.38047: done checking to see if all hosts have failed 13131 1726867195.38047: getting the remaining hosts for this loop 13131 
1726867195.38048: done getting the remaining hosts for this loop 13131 1726867195.38051: getting the next task for host managed_node1 13131 1726867195.38055: done getting next task for host managed_node1 13131 1726867195.38058: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13131 1726867195.38062: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867195.38069: getting variables 13131 1726867195.38071: in VariableManager get_vars() 13131 1726867195.38109: Calling all_inventory to load vars for managed_node1 13131 1726867195.38112: Calling groups_inventory to load vars for managed_node1 13131 1726867195.38114: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867195.38122: Calling all_plugins_play to load vars for managed_node1 13131 1726867195.38124: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867195.38126: Calling groups_plugins_play to load vars for managed_node1 13131 1726867195.38909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867195.39969: done with get_vars() 13131 1726867195.39985: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:19:55 -0400 (0:00:02.019) 0:00:10.513 ****** 13131 1726867195.40280: entering _queue_task() for managed_node1/package_facts 13131 1726867195.40282: Creating lock for package_facts 13131 1726867195.40789: worker is 1 (out of 1 available) 13131 1726867195.40914: exiting _queue_task() for managed_node1/package_facts 13131 1726867195.40926: done queuing things up, now waiting for results queue to drain 13131 1726867195.40927: waiting for pending results... 
13131 1726867195.41663: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 13131 1726867195.41767: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000284 13131 1726867195.41783: variable 'ansible_search_path' from source: unknown 13131 1726867195.41988: variable 'ansible_search_path' from source: unknown 13131 1726867195.42024: calling self._execute() 13131 1726867195.42108: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867195.42116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867195.42183: variable 'omit' from source: magic vars 13131 1726867195.43083: variable 'ansible_distribution_major_version' from source: facts 13131 1726867195.43089: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867195.43093: variable 'omit' from source: magic vars 13131 1726867195.43096: variable 'omit' from source: magic vars 13131 1726867195.43228: variable 'omit' from source: magic vars 13131 1726867195.43265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867195.43390: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867195.43411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867195.43463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867195.43474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867195.43551: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867195.43555: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867195.43558: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 13131 1726867195.43655: Set connection var ansible_connection to ssh 13131 1726867195.43663: Set connection var ansible_timeout to 10 13131 1726867195.43666: Set connection var ansible_shell_type to sh 13131 1726867195.43674: Set connection var ansible_shell_executable to /bin/sh 13131 1726867195.43910: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867195.43918: Set connection var ansible_pipelining to False 13131 1726867195.43939: variable 'ansible_shell_executable' from source: unknown 13131 1726867195.43942: variable 'ansible_connection' from source: unknown 13131 1726867195.43945: variable 'ansible_module_compression' from source: unknown 13131 1726867195.43947: variable 'ansible_shell_type' from source: unknown 13131 1726867195.43949: variable 'ansible_shell_executable' from source: unknown 13131 1726867195.43954: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867195.43958: variable 'ansible_pipelining' from source: unknown 13131 1726867195.43960: variable 'ansible_timeout' from source: unknown 13131 1726867195.43965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867195.44362: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867195.44371: variable 'omit' from source: magic vars 13131 1726867195.44376: starting attempt loop 13131 1726867195.44381: running the handler 13131 1726867195.44397: _low_level_execute_command(): starting 13131 1726867195.44410: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867195.45881: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867195.45981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867195.46293: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867195.46371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867195.48100: stdout chunk (state=3): >>>/root <<< 13131 1726867195.48191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867195.48194: stdout chunk (state=3): >>><<< 13131 1726867195.48197: stderr chunk (state=3): >>><<< 13131 1726867195.48300: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867195.48304: _low_level_execute_command(): starting 13131 1726867195.48308: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867195.4821386-13782-144596104335247 `" && echo ansible-tmp-1726867195.4821386-13782-144596104335247="` echo /root/.ansible/tmp/ansible-tmp-1726867195.4821386-13782-144596104335247 `" ) && sleep 0' 13131 1726867195.49207: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867195.49220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867195.49251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867195.49288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867195.49580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867195.49716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867195.51573: stdout chunk (state=3): >>>ansible-tmp-1726867195.4821386-13782-144596104335247=/root/.ansible/tmp/ansible-tmp-1726867195.4821386-13782-144596104335247 <<< 13131 1726867195.51684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867195.51731: stderr chunk (state=3): >>><<< 13131 1726867195.51734: stdout chunk (state=3): >>><<< 13131 1726867195.51820: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867195.4821386-13782-144596104335247=/root/.ansible/tmp/ansible-tmp-1726867195.4821386-13782-144596104335247 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867195.51824: variable 'ansible_module_compression' from source: unknown 13131 1726867195.51909: ANSIBALLZ: Using lock for package_facts 13131 1726867195.51916: ANSIBALLZ: Acquiring lock 13131 1726867195.51931: ANSIBALLZ: Lock acquired: 140192904620384 13131 1726867195.51953: ANSIBALLZ: Creating module 13131 1726867195.86376: ANSIBALLZ: Writing module into payload 13131 1726867195.86527: ANSIBALLZ: Writing module 13131 1726867195.86557: ANSIBALLZ: Renaming module 13131 1726867195.86570: ANSIBALLZ: Done creating module 13131 1726867195.86610: variable 'ansible_facts' from source: unknown 13131 1726867195.86807: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867195.4821386-13782-144596104335247/AnsiballZ_package_facts.py 13131 1726867195.87010: Sending initial data 13131 1726867195.87013: Sent initial data (162 bytes) 13131 1726867195.87686: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
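The `mkdir` command logged above creates the remote work directory under `~/.ansible/tmp` with `umask 77` (owner-only permissions), using a name of the form `ansible-tmp-<epoch>-<pid>-<random>`. A rough sketch of that naming scheme (the exact format is an ansible-core implementation detail and may differ between versions; `ansible_tmp_name` is a hypothetical helper):

```python
import os
import random
import time

def ansible_tmp_name():
    """Approximate the remote tmpdir names seen in the log, e.g.
    ansible-tmp-1726867195.4821386-13782-144596104335247
    (timestamp with fraction, controller PID, random suffix)."""
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                     random.randint(0, 2**48))

print(ansible_tmp_name())
```

The per-invocation random suffix lets concurrent tasks against the same host use `~/.ansible/tmp` without colliding, and the directory is removed afterwards by the `rm -f -r ... && sleep 0` command seen earlier in the log.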
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867195.87698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867195.87911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867195.87997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867195.89636: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867195.89681: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867195.89730: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmptox8bxq6 /root/.ansible/tmp/ansible-tmp-1726867195.4821386-13782-144596104335247/AnsiballZ_package_facts.py <<< 13131 1726867195.89741: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867195.4821386-13782-144596104335247/AnsiballZ_package_facts.py" <<< 13131 1726867195.89782: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmptox8bxq6" to remote "/root/.ansible/tmp/ansible-tmp-1726867195.4821386-13782-144596104335247/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867195.4821386-13782-144596104335247/AnsiballZ_package_facts.py" <<< 13131 1726867195.92662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867195.93007: stdout chunk (state=3): >>><<< 13131 1726867195.93010: stderr chunk (state=3): >>><<< 13131 1726867195.93013: done transferring module to remote 13131 1726867195.93015: _low_level_execute_command(): starting 13131 1726867195.93017: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867195.4821386-13782-144596104335247/ /root/.ansible/tmp/ansible-tmp-1726867195.4821386-13782-144596104335247/AnsiballZ_package_facts.py && sleep 0' 13131 1726867195.93998: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867195.94013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867195.94024: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867195.94085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867195.94294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867195.94360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867195.96168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867195.96197: stderr chunk (state=3): >>><<< 13131 1726867195.96201: stdout chunk (state=3): >>><<< 13131 1726867195.96312: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867195.96315: _low_level_execute_command(): starting 13131 1726867195.96319: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867195.4821386-13782-144596104335247/AnsiballZ_package_facts.py && sleep 0' 13131 1726867195.97295: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867195.97299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867195.97414: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867195.97439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867195.97605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 
1726867195.97735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867196.41490: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 13131 1726867196.41509: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", 
"version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", 
"version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": 
"crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch":
null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": 
"python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": 
"python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", 
"release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": 
"12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch",
"source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": 
"perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null,
"arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": 
"69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": 
"2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13131 1726867196.43279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867196.43283: stdout chunk (state=3): >>><<< 13131 1726867196.43286: stderr chunk (state=3): >>><<< 13131 1726867196.43389: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867196.46654: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867195.4821386-13782-144596104335247/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867196.46700: _low_level_execute_command(): starting 13131 1726867196.46772: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867195.4821386-13782-144596104335247/ > /dev/null 2>&1 && sleep 0' 13131 1726867196.47345: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867196.47366: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867196.47383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867196.47466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 
originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867196.47515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867196.47532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867196.47554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867196.47646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867196.49568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867196.49571: stdout chunk (state=3): >>><<< 13131 1726867196.49574: stderr chunk (state=3): >>><<< 13131 1726867196.49718: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 13131 1726867196.49935: handler run complete 13131 1726867196.52284: variable 'ansible_facts' from source: unknown 13131 1726867196.53156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867196.57226: variable 'ansible_facts' from source: unknown 13131 1726867196.58007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867196.59461: attempt loop complete, returning result 13131 1726867196.59536: _execute() done 13131 1726867196.59540: dumping result to json 13131 1726867196.59919: done dumping result, returning 13131 1726867196.59984: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-5f24-9b7a-000000000284] 13131 1726867196.59994: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000284 13131 1726867196.78835: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000284 13131 1726867196.78840: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867196.78939: no more pending results, returning what we have 13131 1726867196.78941: results queue empty 13131 1726867196.78942: checking for any_errors_fatal 13131 1726867196.78946: done checking for any_errors_fatal 13131 1726867196.78946: checking for max_fail_percentage 13131 1726867196.78948: done checking for max_fail_percentage 13131 1726867196.78949: checking to see if all hosts have failed and the running result is not ok 13131 1726867196.78949: done checking to see if all hosts have failed 13131 1726867196.78950: getting the remaining hosts for this loop 13131 1726867196.78951: done getting the remaining hosts for this loop 13131 1726867196.78954: getting the next task for host managed_node1 13131 1726867196.78960: done 
getting next task for host managed_node1 13131 1726867196.78963: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13131 1726867196.78966: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867196.78975: getting variables 13131 1726867196.78976: in VariableManager get_vars() 13131 1726867196.79018: Calling all_inventory to load vars for managed_node1 13131 1726867196.79021: Calling groups_inventory to load vars for managed_node1 13131 1726867196.79024: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867196.79032: Calling all_plugins_play to load vars for managed_node1 13131 1726867196.79035: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867196.79038: Calling groups_plugins_play to load vars for managed_node1 13131 1726867196.81903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867196.86301: done with get_vars() 13131 1726867196.86325: done getting variables 13131 1726867196.86784: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:19:56 -0400 (0:00:01.465) 0:00:11.978 ****** 13131 1726867196.86822: entering _queue_task() for managed_node1/debug 13131 1726867196.87525: worker is 1 (out of 1 available) 13131 1726867196.87537: exiting _queue_task() for managed_node1/debug 13131 1726867196.87548: done queuing things up, now waiting for results queue to drain 13131 1726867196.87550: waiting for pending results... 13131 1726867196.88196: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 13131 1726867196.88205: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000027 13131 1726867196.88209: variable 'ansible_search_path' from source: unknown 13131 1726867196.88212: variable 'ansible_search_path' from source: unknown 13131 1726867196.88583: calling self._execute() 13131 1726867196.88588: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867196.88594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867196.88597: variable 'omit' from source: magic vars 13131 1726867196.89241: variable 'ansible_distribution_major_version' from source: facts 13131 1726867196.89583: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867196.89587: variable 'omit' from source: magic vars 13131 1726867196.89589: variable 'omit' from source: magic vars 13131 1726867196.89669: variable 'network_provider' from source: set_fact 13131 1726867196.89698: variable 'omit' from source: magic vars 13131 1726867196.89744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867196.89821: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867196.90185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 
1726867196.90189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867196.90191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867196.90194: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867196.90196: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867196.90198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867196.90200: Set connection var ansible_connection to ssh 13131 1726867196.90202: Set connection var ansible_timeout to 10 13131 1726867196.90204: Set connection var ansible_shell_type to sh 13131 1726867196.90206: Set connection var ansible_shell_executable to /bin/sh 13131 1726867196.90396: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867196.90408: Set connection var ansible_pipelining to False 13131 1726867196.90434: variable 'ansible_shell_executable' from source: unknown 13131 1726867196.90444: variable 'ansible_connection' from source: unknown 13131 1726867196.90451: variable 'ansible_module_compression' from source: unknown 13131 1726867196.90457: variable 'ansible_shell_type' from source: unknown 13131 1726867196.90463: variable 'ansible_shell_executable' from source: unknown 13131 1726867196.90469: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867196.90475: variable 'ansible_pipelining' from source: unknown 13131 1726867196.90484: variable 'ansible_timeout' from source: unknown 13131 1726867196.90494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867196.90835: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867196.90850: variable 'omit' from source: magic vars 13131 1726867196.90860: starting attempt loop 13131 1726867196.90866: running the handler 13131 1726867196.90920: handler run complete 13131 1726867196.90938: attempt loop complete, returning result 13131 1726867196.90946: _execute() done 13131 1726867196.90954: dumping result to json 13131 1726867196.90962: done dumping result, returning 13131 1726867196.90973: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-5f24-9b7a-000000000027] 13131 1726867196.90985: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000027 ok: [managed_node1] => {} MSG: Using network provider: nm 13131 1726867196.91134: no more pending results, returning what we have 13131 1726867196.91137: results queue empty 13131 1726867196.91138: checking for any_errors_fatal 13131 1726867196.91147: done checking for any_errors_fatal 13131 1726867196.91148: checking for max_fail_percentage 13131 1726867196.91150: done checking for max_fail_percentage 13131 1726867196.91151: checking to see if all hosts have failed and the running result is not ok 13131 1726867196.91151: done checking to see if all hosts have failed 13131 1726867196.91153: getting the remaining hosts for this loop 13131 1726867196.91154: done getting the remaining hosts for this loop 13131 1726867196.91157: getting the next task for host managed_node1 13131 1726867196.91164: done getting next task for host managed_node1 13131 1726867196.91168: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13131 1726867196.91171: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867196.91183: getting variables 13131 1726867196.91185: in VariableManager get_vars() 13131 1726867196.91237: Calling all_inventory to load vars for managed_node1 13131 1726867196.91240: Calling groups_inventory to load vars for managed_node1 13131 1726867196.91242: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867196.91252: Calling all_plugins_play to load vars for managed_node1 13131 1726867196.91254: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867196.91257: Calling groups_plugins_play to load vars for managed_node1 13131 1726867196.92487: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000027 13131 1726867196.92494: WORKER PROCESS EXITING 13131 1726867196.95131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867196.99456: done with get_vars() 13131 1726867196.99884: done getting variables 13131 1726867196.99972: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:19:56 -0400 (0:00:00.131) 0:00:12.110 ****** 13131 1726867197.00007: entering _queue_task() for managed_node1/fail 13131 1726867197.00009: Creating lock for fail 13131 1726867197.01141: worker is 1 (out of 1 available) 13131 1726867197.01153: exiting _queue_task() for managed_node1/fail 13131 1726867197.01165: done queuing things up, now waiting for results queue to drain 13131 1726867197.01166: waiting for pending results... 13131 1726867197.01746: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13131 1726867197.01881: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000028 13131 1726867197.01904: variable 'ansible_search_path' from source: unknown 13131 1726867197.02282: variable 'ansible_search_path' from source: unknown 13131 1726867197.02287: calling self._execute() 13131 1726867197.02289: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867197.02295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867197.02297: variable 'omit' from source: magic vars 13131 1726867197.02982: variable 'ansible_distribution_major_version' from source: facts 13131 1726867197.03003: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867197.03224: variable 'network_state' from source: role '' defaults 13131 1726867197.03241: Evaluated conditional (network_state != {}): False 13131 1726867197.03250: when evaluation is False, skipping this task 13131 1726867197.03258: _execute() done 13131 1726867197.03388: dumping result to json 13131 1726867197.03399: done dumping result, returning 13131 1726867197.03408: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-5f24-9b7a-000000000028] 13131 1726867197.03416: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000028 13131 1726867197.03517: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000028 13131 1726867197.03524: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867197.03623: no more pending results, returning what we have 13131 1726867197.03627: results queue empty 13131 1726867197.03628: checking for any_errors_fatal 13131 1726867197.03632: done checking for any_errors_fatal 13131 1726867197.03633: checking for max_fail_percentage 13131 1726867197.03634: done checking for max_fail_percentage 13131 1726867197.03635: checking to see if all hosts have failed and the running result is not ok 13131 1726867197.03636: done checking to see if all hosts have failed 13131 1726867197.03637: getting the remaining hosts for this loop 13131 1726867197.03638: done getting the remaining hosts for this loop 13131 1726867197.03641: getting the next task for host managed_node1 13131 1726867197.03648: done getting next task for host managed_node1 13131 1726867197.03651: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13131 1726867197.03654: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13131 1726867197.03669: getting variables 13131 1726867197.03670: in VariableManager get_vars() 13131 1726867197.03720: Calling all_inventory to load vars for managed_node1 13131 1726867197.03723: Calling groups_inventory to load vars for managed_node1 13131 1726867197.03725: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867197.03734: Calling all_plugins_play to load vars for managed_node1 13131 1726867197.03736: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867197.03738: Calling groups_plugins_play to load vars for managed_node1 13131 1726867197.06696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867197.09819: done with get_vars() 13131 1726867197.09841: done getting variables 13131 1726867197.10011: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:19:57 -0400 (0:00:00.100) 0:00:12.211 ****** 13131 1726867197.10046: entering _queue_task() for managed_node1/fail 13131 1726867197.10716: worker is 1 (out of 1 available) 13131 1726867197.10727: exiting _queue_task() for managed_node1/fail 13131 1726867197.10958: done queuing things up, now waiting for results queue to drain 13131 1726867197.10960: waiting for pending results... 
13131 1726867197.11320: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13131 1726867197.11554: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000029 13131 1726867197.11627: variable 'ansible_search_path' from source: unknown 13131 1726867197.11638: variable 'ansible_search_path' from source: unknown 13131 1726867197.11683: calling self._execute() 13131 1726867197.12059: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867197.12063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867197.12066: variable 'omit' from source: magic vars 13131 1726867197.12746: variable 'ansible_distribution_major_version' from source: facts 13131 1726867197.12834: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867197.13084: variable 'network_state' from source: role '' defaults 13131 1726867197.13101: Evaluated conditional (network_state != {}): False 13131 1726867197.13109: when evaluation is False, skipping this task 13131 1726867197.13117: _execute() done 13131 1726867197.13123: dumping result to json 13131 1726867197.13152: done dumping result, returning 13131 1726867197.13164: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-5f24-9b7a-000000000029] 13131 1726867197.13200: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000029 13131 1726867197.13489: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000029 13131 1726867197.13492: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867197.13729: no more pending results, returning what we have 13131 
1726867197.13733: results queue empty 13131 1726867197.13734: checking for any_errors_fatal 13131 1726867197.13742: done checking for any_errors_fatal 13131 1726867197.13743: checking for max_fail_percentage 13131 1726867197.13745: done checking for max_fail_percentage 13131 1726867197.13746: checking to see if all hosts have failed and the running result is not ok 13131 1726867197.13747: done checking to see if all hosts have failed 13131 1726867197.13747: getting the remaining hosts for this loop 13131 1726867197.13749: done getting the remaining hosts for this loop 13131 1726867197.13752: getting the next task for host managed_node1 13131 1726867197.13760: done getting next task for host managed_node1 13131 1726867197.13763: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13131 1726867197.13767: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867197.13784: getting variables 13131 1726867197.13788: in VariableManager get_vars() 13131 1726867197.13842: Calling all_inventory to load vars for managed_node1 13131 1726867197.13845: Calling groups_inventory to load vars for managed_node1 13131 1726867197.13848: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867197.13860: Calling all_plugins_play to load vars for managed_node1 13131 1726867197.13863: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867197.13866: Calling groups_plugins_play to load vars for managed_node1 13131 1726867197.16780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867197.20032: done with get_vars() 13131 1726867197.20055: done getting variables 13131 1726867197.20226: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:19:57 -0400 (0:00:00.102) 0:00:12.313 ****** 13131 1726867197.20258: entering _queue_task() for managed_node1/fail 13131 1726867197.21055: worker is 1 (out of 1 available) 13131 1726867197.21066: exiting _queue_task() for managed_node1/fail 13131 1726867197.21075: done queuing things up, now waiting for results queue to drain 13131 1726867197.21076: waiting for pending results... 
13131 1726867197.21596: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13131 1726867197.21716: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000002a 13131 1726867197.21738: variable 'ansible_search_path' from source: unknown 13131 1726867197.21934: variable 'ansible_search_path' from source: unknown 13131 1726867197.21938: calling self._execute() 13131 1726867197.21998: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867197.22053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867197.22068: variable 'omit' from source: magic vars 13131 1726867197.22804: variable 'ansible_distribution_major_version' from source: facts 13131 1726867197.23022: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867197.23246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867197.27743: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867197.27971: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867197.28081: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867197.28185: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867197.28197: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867197.28404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867197.28433: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867197.28465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867197.28554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867197.28640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867197.28858: variable 'ansible_distribution_major_version' from source: facts 13131 1726867197.28880: Evaluated conditional (ansible_distribution_major_version | int > 9): True 13131 1726867197.29118: variable 'ansible_distribution' from source: facts 13131 1726867197.29180: variable '__network_rh_distros' from source: role '' defaults 13131 1726867197.29304: Evaluated conditional (ansible_distribution in __network_rh_distros): True 13131 1726867197.29682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867197.29885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867197.29889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 
1726867197.29927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867197.30004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867197.30092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867197.30179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867197.30282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867197.30310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867197.30429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867197.30584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867197.30588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13131 1726867197.30591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867197.30630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867197.30710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867197.31412: variable 'network_connections' from source: task vars 13131 1726867197.31467: variable 'controller_profile' from source: play vars 13131 1726867197.31683: variable 'controller_profile' from source: play vars 13131 1726867197.31687: variable 'controller_device' from source: play vars 13131 1726867197.31788: variable 'controller_device' from source: play vars 13131 1726867197.31805: variable 'port1_profile' from source: play vars 13131 1726867197.31998: variable 'port1_profile' from source: play vars 13131 1726867197.32001: variable 'dhcp_interface1' from source: play vars 13131 1726867197.32058: variable 'dhcp_interface1' from source: play vars 13131 1726867197.32117: variable 'controller_profile' from source: play vars 13131 1726867197.32238: variable 'controller_profile' from source: play vars 13131 1726867197.32325: variable 'port2_profile' from source: play vars 13131 1726867197.32416: variable 'port2_profile' from source: play vars 13131 1726867197.32432: variable 'dhcp_interface2' from source: play vars 13131 1726867197.32542: variable 'dhcp_interface2' from source: play vars 13131 1726867197.32602: variable 'controller_profile' from source: play vars 13131 1726867197.32760: variable 'controller_profile' from source: play vars 13131 1726867197.32763: 
variable 'network_state' from source: role '' defaults 13131 1726867197.32902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867197.33386: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867197.33390: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867197.33392: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867197.33394: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867197.33649: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867197.33676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867197.33825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867197.33857: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867197.33906: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 13131 1726867197.33913: when evaluation is False, skipping this task 13131 1726867197.33919: _execute() done 13131 1726867197.34149: dumping result to 
json 13131 1726867197.34152: done dumping result, returning 13131 1726867197.34155: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-5f24-9b7a-00000000002a] 13131 1726867197.34158: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000002a 13131 1726867197.34232: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000002a 13131 1726867197.34235: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 13131 1726867197.34307: no more pending results, returning what we have 13131 1726867197.34311: results queue empty 13131 1726867197.34313: checking for any_errors_fatal 13131 1726867197.34319: done checking for any_errors_fatal 13131 1726867197.34320: checking for max_fail_percentage 13131 1726867197.34322: done checking for max_fail_percentage 13131 1726867197.34323: checking to see if all hosts have failed and the running result is not ok 13131 1726867197.34324: done checking to see if all hosts have failed 13131 1726867197.34324: getting the remaining hosts for this loop 13131 1726867197.34326: done getting the remaining hosts for this loop 13131 1726867197.34330: getting the next task for host managed_node1 13131 1726867197.34338: done getting next task for host managed_node1 13131 1726867197.34342: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13131 1726867197.34346: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867197.34366: getting variables 13131 1726867197.34368: in VariableManager get_vars() 13131 1726867197.34432: Calling all_inventory to load vars for managed_node1 13131 1726867197.34435: Calling groups_inventory to load vars for managed_node1 13131 1726867197.34439: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867197.34451: Calling all_plugins_play to load vars for managed_node1 13131 1726867197.34454: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867197.34457: Calling groups_plugins_play to load vars for managed_node1 13131 1726867197.37919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867197.39994: done with get_vars() 13131 1726867197.40022: done getting variables 13131 1726867197.40159: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:19:57 -0400 (0:00:00.199) 0:00:12.512 ****** 
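The teaming-abort task traced above was skipped because its `when` conditional evaluated to False. The logic of that Jinja2 expression (`selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0` over both `network_connections` and `network_state`) can be reproduced in plain Python as a rough sketch. The sample data and the helper name `has_team` are assumptions for illustration, not values from this run:

```python
# Hedged sketch of the role's teaming-abort condition in plain Python.
# Sample connection list is hypothetical; this run used bond profiles,
# so no entry has type "team" and the task is skipped.
network_connections = [
    {"name": "bond0", "type": "bond"},
    {"name": "bond0.0", "type": "ethernet"},
]
network_state = {}  # role default in this run

def has_team(items):
    # Mirrors: selectattr("type", "defined") | selectattr("type", "match", "^team$")
    return any(c.get("type") == "team" for c in items if "type" in c)

should_fail = has_team(network_connections) or has_team(
    network_state.get("interfaces", [])
)
print(should_fail)  # False -> "skip_reason": "Conditional result was False"
```

A False result here is what produces the `skipping: [managed_node1]` output seen in the log, rather than the role failing on an EL10+ host.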
13131 1726867197.40198: entering _queue_task() for managed_node1/dnf 13131 1726867197.40878: worker is 1 (out of 1 available) 13131 1726867197.40890: exiting _queue_task() for managed_node1/dnf 13131 1726867197.40899: done queuing things up, now waiting for results queue to drain 13131 1726867197.40900: waiting for pending results... 13131 1726867197.41111: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13131 1726867197.41310: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000002b 13131 1726867197.41507: variable 'ansible_search_path' from source: unknown 13131 1726867197.41510: variable 'ansible_search_path' from source: unknown 13131 1726867197.41514: calling self._execute() 13131 1726867197.41632: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867197.41788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867197.41792: variable 'omit' from source: magic vars 13131 1726867197.42598: variable 'ansible_distribution_major_version' from source: facts 13131 1726867197.42723: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867197.42949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867197.45485: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867197.45572: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867197.45880: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867197.45884: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867197.45886: Loading FilterModule 'urlsplit' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867197.46050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867197.46183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867197.46233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867197.46338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867197.46401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867197.46747: variable 'ansible_distribution' from source: facts 13131 1726867197.46751: variable 'ansible_distribution_major_version' from source: facts 13131 1726867197.46753: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13131 1726867197.47072: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867197.47449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867197.47557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867197.47560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867197.47563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867197.47571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867197.47618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867197.47697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867197.47726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867197.47820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867197.47907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867197.47952: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867197.48129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867197.48158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867197.48204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867197.48230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867197.48671: variable 'network_connections' from source: task vars 13131 1726867197.48675: variable 'controller_profile' from source: play vars 13131 1726867197.48869: variable 'controller_profile' from source: play vars 13131 1726867197.48872: variable 'controller_device' from source: play vars 13131 1726867197.48997: variable 'controller_device' from source: play vars 13131 1726867197.49000: variable 'port1_profile' from source: play vars 13131 1726867197.49059: variable 'port1_profile' from source: play vars 13131 1726867197.49095: variable 'dhcp_interface1' from source: play vars 13131 1726867197.49284: variable 'dhcp_interface1' from source: play vars 13131 1726867197.49288: variable 'controller_profile' from source: play vars 13131 1726867197.49430: variable 'controller_profile' from source: play vars 13131 1726867197.49443: variable 'port2_profile' from source: play vars 13131 
1726867197.49592: variable 'port2_profile' from source: play vars 13131 1726867197.49609: variable 'dhcp_interface2' from source: play vars 13131 1726867197.49681: variable 'dhcp_interface2' from source: play vars 13131 1726867197.49728: variable 'controller_profile' from source: play vars 13131 1726867197.49893: variable 'controller_profile' from source: play vars 13131 1726867197.50081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867197.50425: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867197.50536: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867197.50570: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867197.50637: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867197.50759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867197.50947: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867197.51040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867197.51044: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867197.51179: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867197.51705: variable 
'network_connections' from source: task vars 13131 1726867197.51716: variable 'controller_profile' from source: play vars 13131 1726867197.51786: variable 'controller_profile' from source: play vars 13131 1726867197.51849: variable 'controller_device' from source: play vars 13131 1726867197.51992: variable 'controller_device' from source: play vars 13131 1726867197.52007: variable 'port1_profile' from source: play vars 13131 1726867197.52149: variable 'port1_profile' from source: play vars 13131 1726867197.52168: variable 'dhcp_interface1' from source: play vars 13131 1726867197.52388: variable 'dhcp_interface1' from source: play vars 13131 1726867197.52391: variable 'controller_profile' from source: play vars 13131 1726867197.52603: variable 'controller_profile' from source: play vars 13131 1726867197.52606: variable 'port2_profile' from source: play vars 13131 1726867197.52608: variable 'port2_profile' from source: play vars 13131 1726867197.52611: variable 'dhcp_interface2' from source: play vars 13131 1726867197.52743: variable 'dhcp_interface2' from source: play vars 13131 1726867197.52833: variable 'controller_profile' from source: play vars 13131 1726867197.52892: variable 'controller_profile' from source: play vars 13131 1726867197.52966: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13131 1726867197.53148: when evaluation is False, skipping this task 13131 1726867197.53153: _execute() done 13131 1726867197.53155: dumping result to json 13131 1726867197.53157: done dumping result, returning 13131 1726867197.53160: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-00000000002b] 13131 1726867197.53162: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000002b 13131 1726867197.53232: done sending task result for 
task 0affcac9-a3a5-5f24-9b7a-00000000002b 13131 1726867197.53235: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13131 1726867197.53304: no more pending results, returning what we have 13131 1726867197.53307: results queue empty 13131 1726867197.53309: checking for any_errors_fatal 13131 1726867197.53313: done checking for any_errors_fatal 13131 1726867197.53314: checking for max_fail_percentage 13131 1726867197.53316: done checking for max_fail_percentage 13131 1726867197.53317: checking to see if all hosts have failed and the running result is not ok 13131 1726867197.53318: done checking to see if all hosts have failed 13131 1726867197.53318: getting the remaining hosts for this loop 13131 1726867197.53320: done getting the remaining hosts for this loop 13131 1726867197.53323: getting the next task for host managed_node1 13131 1726867197.53330: done getting next task for host managed_node1 13131 1726867197.53334: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13131 1726867197.53337: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867197.53470: getting variables 13131 1726867197.53472: in VariableManager get_vars() 13131 1726867197.53531: Calling all_inventory to load vars for managed_node1 13131 1726867197.53534: Calling groups_inventory to load vars for managed_node1 13131 1726867197.53536: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867197.53547: Calling all_plugins_play to load vars for managed_node1 13131 1726867197.53550: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867197.53553: Calling groups_plugins_play to load vars for managed_node1 13131 1726867197.56572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867197.59955: done with get_vars() 13131 1726867197.60101: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13131 1726867197.60238: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:19:57 -0400 (0:00:00.200) 0:00:12.713 ****** 13131 1726867197.60270: entering _queue_task() for managed_node1/yum 13131 1726867197.60272: Creating lock for yum 13131 1726867197.61073: worker is 1 (out of 1 available) 13131 1726867197.61089: exiting _queue_task() for managed_node1/yum 13131 1726867197.61168: done queuing things up, now waiting for results queue to drain 13131 1726867197.61170: waiting for pending results... 
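The DNF package-check task above was gated on two role flags, and both evaluated to False in this run. A minimal sketch of that boolean gate, with the flag values assumed to match the role defaults observed in the log:

```python
# Hedged sketch: the DNF update-check task runs only when wireless or team
# connections are defined. Both flags are False here, so the task is skipped.
__network_wireless_connections_defined = False  # assumed from role defaults
__network_team_connections_defined = False      # assumed from role defaults

run_task = (
    __network_wireless_connections_defined
    or __network_team_connections_defined
)
print(run_task)  # False -> "skip_reason": "Conditional result was False"
```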
13131 1726867197.61645: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13131 1726867197.61681: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000002c 13131 1726867197.61820: variable 'ansible_search_path' from source: unknown 13131 1726867197.61824: variable 'ansible_search_path' from source: unknown 13131 1726867197.61841: calling self._execute() 13131 1726867197.61974: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867197.61980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867197.61984: variable 'omit' from source: magic vars 13131 1726867197.62841: variable 'ansible_distribution_major_version' from source: facts 13131 1726867197.62846: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867197.63073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867197.67589: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867197.67616: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867197.67650: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867197.67685: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867197.67916: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867197.67920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867197.67922: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867197.67925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867197.67927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867197.67930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867197.67987: variable 'ansible_distribution_major_version' from source: facts 13131 1726867197.68005: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13131 1726867197.68009: when evaluation is False, skipping this task 13131 1726867197.68012: _execute() done 13131 1726867197.68022: dumping result to json 13131 1726867197.68026: done dumping result, returning 13131 1726867197.68028: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-00000000002c] 13131 1726867197.68031: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000002c 13131 1726867197.68307: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000002c 13131 1726867197.68310: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13131 1726867197.68352: no more pending results, returning 
what we have 13131 1726867197.68355: results queue empty 13131 1726867197.68355: checking for any_errors_fatal 13131 1726867197.68359: done checking for any_errors_fatal 13131 1726867197.68360: checking for max_fail_percentage 13131 1726867197.68361: done checking for max_fail_percentage 13131 1726867197.68362: checking to see if all hosts have failed and the running result is not ok 13131 1726867197.68363: done checking to see if all hosts have failed 13131 1726867197.68363: getting the remaining hosts for this loop 13131 1726867197.68364: done getting the remaining hosts for this loop 13131 1726867197.68367: getting the next task for host managed_node1 13131 1726867197.68373: done getting next task for host managed_node1 13131 1726867197.68376: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13131 1726867197.68381: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867197.68394: getting variables 13131 1726867197.68396: in VariableManager get_vars() 13131 1726867197.68445: Calling all_inventory to load vars for managed_node1 13131 1726867197.68448: Calling groups_inventory to load vars for managed_node1 13131 1726867197.68450: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867197.68459: Calling all_plugins_play to load vars for managed_node1 13131 1726867197.68461: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867197.68464: Calling groups_plugins_play to load vars for managed_node1 13131 1726867197.71575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867197.74893: done with get_vars() 13131 1726867197.74920: done getting variables 13131 1726867197.74978: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:19:57 -0400 (0:00:00.148) 0:00:12.861 ****** 13131 1726867197.75131: entering _queue_task() for managed_node1/fail 13131 1726867197.75841: worker is 1 (out of 1 available) 13131 1726867197.75853: exiting _queue_task() for managed_node1/fail 13131 1726867197.75864: done queuing things up, now waiting for results queue to drain 13131 1726867197.75866: waiting for pending results... 
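The skip recorded just above ("false_condition": "ansible_distribution_major_version | int < 8") comes from a `when:` guard on the role task. A minimal sketch of such a task follows; only the task name and the `when:` expression are taken from the log — the module body shown here is an assumption, not the role's actual implementation:

```yaml
# Hypothetical reconstruction of the skipped task. The name and the when:
# condition appear in the log above; the command body is assumed.
- name: Check if updates for network packages are available through the YUM
    package manager due to wireless or team interfaces
  ansible.builtin.command: yum check-update  # assumed; actual body not in log
  register: __network_yum_check  # assumed variable name
  changed_when: false
  failed_when: false
  when: ansible_distribution_major_version | int < 8
```

On this managed node the distribution major version is 8 or later, so the conditional evaluates to False and Ansible reports `skip_reason: "Conditional result was False"` without ever running the module.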
13131 1726867197.76295: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13131 1726867197.76602: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000002d 13131 1726867197.76616: variable 'ansible_search_path' from source: unknown 13131 1726867197.76619: variable 'ansible_search_path' from source: unknown 13131 1726867197.76655: calling self._execute() 13131 1726867197.76748: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867197.76755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867197.76758: variable 'omit' from source: magic vars 13131 1726867197.77783: variable 'ansible_distribution_major_version' from source: facts 13131 1726867197.77786: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867197.77789: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867197.77984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867197.80169: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867197.80255: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867197.80289: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867197.80330: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867197.80355: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867197.80434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13131 1726867197.80464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867197.80491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867197.80537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867197.80552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867197.80682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867197.80686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867197.80688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867197.80691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867197.80708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867197.80746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867197.80964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867197.80967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867197.80970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867197.80972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867197.81025: variable 'network_connections' from source: task vars 13131 1726867197.81037: variable 'controller_profile' from source: play vars 13131 1726867197.81104: variable 'controller_profile' from source: play vars 13131 1726867197.81114: variable 'controller_device' from source: play vars 13131 1726867197.81178: variable 'controller_device' from source: play vars 13131 1726867197.81183: variable 'port1_profile' from source: play vars 13131 1726867197.81395: variable 'port1_profile' from source: play vars 13131 1726867197.81399: variable 'dhcp_interface1' from source: play vars 13131 1726867197.81402: variable 'dhcp_interface1' from source: play vars 13131 1726867197.81405: variable 'controller_profile' from source: play vars 13131 
1726867197.81407: variable 'controller_profile' from source: play vars 13131 1726867197.81409: variable 'port2_profile' from source: play vars 13131 1726867197.81443: variable 'port2_profile' from source: play vars 13131 1726867197.81450: variable 'dhcp_interface2' from source: play vars 13131 1726867197.81523: variable 'dhcp_interface2' from source: play vars 13131 1726867197.81526: variable 'controller_profile' from source: play vars 13131 1726867197.81724: variable 'controller_profile' from source: play vars 13131 1726867197.81727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867197.81819: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867197.81860: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867197.81891: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867197.81921: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867197.81966: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867197.81989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867197.82020: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867197.82049: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 13131 1726867197.82264: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867197.82392: variable 'network_connections' from source: task vars 13131 1726867197.82396: variable 'controller_profile' from source: play vars 13131 1726867197.82451: variable 'controller_profile' from source: play vars 13131 1726867197.82457: variable 'controller_device' from source: play vars 13131 1726867197.82521: variable 'controller_device' from source: play vars 13131 1726867197.82530: variable 'port1_profile' from source: play vars 13131 1726867197.82610: variable 'port1_profile' from source: play vars 13131 1726867197.82613: variable 'dhcp_interface1' from source: play vars 13131 1726867197.82657: variable 'dhcp_interface1' from source: play vars 13131 1726867197.82663: variable 'controller_profile' from source: play vars 13131 1726867197.82731: variable 'controller_profile' from source: play vars 13131 1726867197.82738: variable 'port2_profile' from source: play vars 13131 1726867197.82935: variable 'port2_profile' from source: play vars 13131 1726867197.82939: variable 'dhcp_interface2' from source: play vars 13131 1726867197.82941: variable 'dhcp_interface2' from source: play vars 13131 1726867197.82943: variable 'controller_profile' from source: play vars 13131 1726867197.82945: variable 'controller_profile' from source: play vars 13131 1726867197.82972: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13131 1726867197.82975: when evaluation is False, skipping this task 13131 1726867197.82980: _execute() done 13131 1726867197.82982: dumping result to json 13131 1726867197.82984: done dumping result, returning 13131 1726867197.82993: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-00000000002d] 13131 1726867197.83000: sending 
task result for task 0affcac9-a3a5-5f24-9b7a-00000000002d skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13131 1726867197.83298: no more pending results, returning what we have 13131 1726867197.83301: results queue empty 13131 1726867197.83302: checking for any_errors_fatal 13131 1726867197.83306: done checking for any_errors_fatal 13131 1726867197.83307: checking for max_fail_percentage 13131 1726867197.83308: done checking for max_fail_percentage 13131 1726867197.83309: checking to see if all hosts have failed and the running result is not ok 13131 1726867197.83310: done checking to see if all hosts have failed 13131 1726867197.83310: getting the remaining hosts for this loop 13131 1726867197.83312: done getting the remaining hosts for this loop 13131 1726867197.83315: getting the next task for host managed_node1 13131 1726867197.83320: done getting next task for host managed_node1 13131 1726867197.83323: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13131 1726867197.83326: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867197.83339: getting variables 13131 1726867197.83340: in VariableManager get_vars() 13131 1726867197.83386: Calling all_inventory to load vars for managed_node1 13131 1726867197.83389: Calling groups_inventory to load vars for managed_node1 13131 1726867197.83391: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867197.83399: Calling all_plugins_play to load vars for managed_node1 13131 1726867197.83402: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867197.83404: Calling groups_plugins_play to load vars for managed_node1 13131 1726867197.84141: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000002d 13131 1726867197.84145: WORKER PROCESS EXITING 13131 1726867197.86228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867197.89584: done with get_vars() 13131 1726867197.89609: done getting variables 13131 1726867197.89787: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:19:57 -0400 (0:00:00.146) 0:00:13.008 ****** 13131 1726867197.89819: entering _queue_task() for managed_node1/package 13131 1726867197.90561: worker is 1 (out of 1 available) 13131 1726867197.90573: exiting _queue_task() for managed_node1/package 13131 1726867197.90588: done queuing things up, now waiting for results queue to drain 13131 1726867197.90589: waiting for pending results... 
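The consent task above was skipped because neither wireless nor team connections are defined. The log gives the action plugin (`fail`), the task path (`roles/network/tasks/main.yml:60`), and the `when:` expression; a sketch of what that task plausibly looks like, with the failure message wording assumed:

```yaml
# Sketch reconstructed from the log: task name, fail action, and when:
# expression are from the log above; the msg text is an assumption.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-  # assumed wording
      Managing wireless or team connections requires restarting
      NetworkManager, which may interrupt connectivity.
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Because both role-default variables evaluated to False for this play (only bond controller and port profiles are defined), the `fail` action never executed and the play continued to the "Install packages" task.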
13131 1726867197.91047: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 13131 1726867197.91278: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000002e 13131 1726867197.91340: variable 'ansible_search_path' from source: unknown 13131 1726867197.91344: variable 'ansible_search_path' from source: unknown 13131 1726867197.91402: calling self._execute() 13131 1726867197.91621: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867197.91682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867197.91686: variable 'omit' from source: magic vars 13131 1726867197.92275: variable 'ansible_distribution_major_version' from source: facts 13131 1726867197.92281: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867197.92401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867197.92683: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867197.92727: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867197.92770: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867197.92805: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867197.93041: variable 'network_packages' from source: role '' defaults 13131 1726867197.93052: variable '__network_provider_setup' from source: role '' defaults 13131 1726867197.93064: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867197.93137: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867197.93149: variable '__network_packages_default_nm' from source: role '' defaults 13131 1726867197.93265: variable 
'__network_packages_default_nm' from source: role '' defaults 13131 1726867197.93549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867197.96287: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867197.96337: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867197.96371: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867197.96407: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867197.96437: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867197.96511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867197.96542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867197.96567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867197.96612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867197.96626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 
1726867197.96672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867197.96698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867197.96721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867197.96762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867197.96776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867198.00951: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13131 1726867198.01069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867198.01091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867198.01188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867198.01191: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867198.01194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867198.01317: variable 'ansible_python' from source: facts 13131 1726867198.01436: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13131 1726867198.01440: variable '__network_wpa_supplicant_required' from source: role '' defaults 13131 1726867198.01721: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13131 1726867198.01943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867198.01966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867198.02094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867198.02140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867198.02153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867198.02200: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867198.02285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867198.02368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867198.02418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867198.02431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867198.02749: variable 'network_connections' from source: task vars 13131 1726867198.02753: variable 'controller_profile' from source: play vars 13131 1726867198.02857: variable 'controller_profile' from source: play vars 13131 1726867198.02866: variable 'controller_device' from source: play vars 13131 1726867198.03282: variable 'controller_device' from source: play vars 13131 1726867198.03285: variable 'port1_profile' from source: play vars 13131 1726867198.03361: variable 'port1_profile' from source: play vars 13131 1726867198.03370: variable 'dhcp_interface1' from source: play vars 13131 1726867198.03582: variable 'dhcp_interface1' from source: play vars 13131 1726867198.03591: variable 'controller_profile' from source: play vars 13131 1726867198.03825: variable 'controller_profile' from source: play vars 13131 1726867198.03836: variable 'port2_profile' from source: play vars 13131 
1726867198.04052: variable 'port2_profile' from source: play vars 13131 1726867198.04060: variable 'dhcp_interface2' from source: play vars 13131 1726867198.04167: variable 'dhcp_interface2' from source: play vars 13131 1726867198.04175: variable 'controller_profile' from source: play vars 13131 1726867198.04293: variable 'controller_profile' from source: play vars 13131 1726867198.04362: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867198.04395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867198.04484: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867198.04487: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867198.04493: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867198.04782: variable 'network_connections' from source: task vars 13131 1726867198.04788: variable 'controller_profile' from source: play vars 13131 1726867198.04892: variable 'controller_profile' from source: play vars 13131 1726867198.04905: variable 'controller_device' from source: play vars 13131 1726867198.05039: variable 'controller_device' from source: play vars 13131 1726867198.05045: variable 'port1_profile' from source: play vars 13131 1726867198.05148: variable 'port1_profile' from source: play vars 13131 1726867198.05154: variable 'dhcp_interface1' from source: play vars 13131 1726867198.05256: variable 'dhcp_interface1' from source: 
play vars 13131 1726867198.05265: variable 'controller_profile' from source: play vars 13131 1726867198.05361: variable 'controller_profile' from source: play vars 13131 1726867198.05683: variable 'port2_profile' from source: play vars 13131 1726867198.05690: variable 'port2_profile' from source: play vars 13131 1726867198.05693: variable 'dhcp_interface2' from source: play vars 13131 1726867198.05695: variable 'dhcp_interface2' from source: play vars 13131 1726867198.05697: variable 'controller_profile' from source: play vars 13131 1726867198.05699: variable 'controller_profile' from source: play vars 13131 1726867198.05764: variable '__network_packages_default_wireless' from source: role '' defaults 13131 1726867198.06170: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867198.06445: variable 'network_connections' from source: task vars 13131 1726867198.06502: variable 'controller_profile' from source: play vars 13131 1726867198.06607: variable 'controller_profile' from source: play vars 13131 1726867198.06710: variable 'controller_device' from source: play vars 13131 1726867198.06936: variable 'controller_device' from source: play vars 13131 1726867198.06940: variable 'port1_profile' from source: play vars 13131 1726867198.07005: variable 'port1_profile' from source: play vars 13131 1726867198.07057: variable 'dhcp_interface1' from source: play vars 13131 1726867198.07205: variable 'dhcp_interface1' from source: play vars 13131 1726867198.07251: variable 'controller_profile' from source: play vars 13131 1726867198.07406: variable 'controller_profile' from source: play vars 13131 1726867198.07418: variable 'port2_profile' from source: play vars 13131 1726867198.07502: variable 'port2_profile' from source: play vars 13131 1726867198.07518: variable 'dhcp_interface2' from source: play vars 13131 1726867198.07604: variable 'dhcp_interface2' from source: play vars 13131 1726867198.07625: variable 'controller_profile' from 
source: play vars
13131 1726867198.07716: variable 'controller_profile' from source: play vars
13131 1726867198.07764: variable '__network_packages_default_team' from source: role '' defaults
13131 1726867198.07892: variable '__network_team_connections_defined' from source: role '' defaults
13131 1726867198.08230: variable 'network_connections' from source: task vars
13131 1726867198.08248: variable 'controller_profile' from source: play vars
13131 1726867198.08335: variable 'controller_profile' from source: play vars
13131 1726867198.08347: variable 'controller_device' from source: play vars
13131 1726867198.08420: variable 'controller_device' from source: play vars
13131 1726867198.08466: variable 'port1_profile' from source: play vars
13131 1726867198.08524: variable 'port1_profile' from source: play vars
13131 1726867198.08537: variable 'dhcp_interface1' from source: play vars
13131 1726867198.08618: variable 'dhcp_interface1' from source: play vars
13131 1726867198.08658: variable 'controller_profile' from source: play vars
13131 1726867198.08708: variable 'controller_profile' from source: play vars
13131 1726867198.08719: variable 'port2_profile' from source: play vars
13131 1726867198.08818: variable 'port2_profile' from source: play vars
13131 1726867198.08879: variable 'dhcp_interface2' from source: play vars
13131 1726867198.08945: variable 'dhcp_interface2' from source: play vars
13131 1726867198.08957: variable 'controller_profile' from source: play vars
13131 1726867198.09050: variable 'controller_profile' from source: play vars
13131 1726867198.09126: variable '__network_service_name_default_initscripts' from source: role '' defaults
13131 1726867198.09227: variable '__network_service_name_default_initscripts' from source: role '' defaults
13131 1726867198.09250: variable '__network_packages_default_initscripts' from source: role '' defaults
13131 1726867198.09359: variable '__network_packages_default_initscripts' from source: role '' defaults
13131 1726867198.09590: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
13131 1726867198.10158: variable 'network_connections' from source: task vars
13131 1726867198.10170: variable 'controller_profile' from source: play vars
13131 1726867198.10311: variable 'controller_profile' from source: play vars
13131 1726867198.10318: variable 'controller_device' from source: play vars
13131 1726867198.10352: variable 'controller_device' from source: play vars
13131 1726867198.10367: variable 'port1_profile' from source: play vars
13131 1726867198.10529: variable 'port1_profile' from source: play vars
13131 1726867198.10532: variable 'dhcp_interface1' from source: play vars
13131 1726867198.10535: variable 'dhcp_interface1' from source: play vars
13131 1726867198.10539: variable 'controller_profile' from source: play vars
13131 1726867198.10613: variable 'controller_profile' from source: play vars
13131 1726867198.10625: variable 'port2_profile' from source: play vars
13131 1726867198.10737: variable 'port2_profile' from source: play vars
13131 1726867198.10754: variable 'dhcp_interface2' from source: play vars
13131 1726867198.10828: variable 'dhcp_interface2' from source: play vars
13131 1726867198.10855: variable 'controller_profile' from source: play vars
13131 1726867198.10986: variable 'controller_profile' from source: play vars
13131 1726867198.10990: variable 'ansible_distribution' from source: facts
13131 1726867198.10992: variable '__network_rh_distros' from source: role '' defaults
13131 1726867198.10994: variable 'ansible_distribution_major_version' from source: facts
13131 1726867198.10995: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
13131 1726867198.11158: variable 'ansible_distribution' from source: facts
13131 1726867198.11167: variable '__network_rh_distros' from source: role '' defaults
13131 1726867198.11179: variable 'ansible_distribution_major_version' from source: facts
13131 1726867198.11209: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
13131 1726867198.11531: variable 'ansible_distribution' from source: facts
13131 1726867198.11534: variable '__network_rh_distros' from source: role '' defaults
13131 1726867198.11536: variable 'ansible_distribution_major_version' from source: facts
13131 1726867198.11890: variable 'network_provider' from source: set_fact
13131 1726867198.11893: variable 'ansible_facts' from source: unknown
13131 1726867198.13107: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
13131 1726867198.13116: when evaluation is False, skipping this task
13131 1726867198.13123: _execute() done
13131 1726867198.13131: dumping result to json
13131 1726867198.13137: done dumping result, returning
13131 1726867198.13148: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-5f24-9b7a-00000000002e]
13131 1726867198.13161: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000002e
13131 1726867198.13307: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000002e
13131 1726867198.13399: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
13131 1726867198.13449: no more pending results, returning what we have
13131 1726867198.13452: results queue empty
13131 1726867198.13453: checking for any_errors_fatal
13131 1726867198.13458: done checking for any_errors_fatal
13131 1726867198.13459: checking for max_fail_percentage
13131 1726867198.13461: done checking for max_fail_percentage
13131 1726867198.13461: checking to see if all hosts have failed and the running result is not ok
13131 1726867198.13462: done checking to see if all hosts have failed
13131 1726867198.13463: getting the remaining hosts for this loop
13131 1726867198.13464: done getting the remaining hosts for this loop
13131 1726867198.13468: getting the next task for host managed_node1
13131 1726867198.13475: done getting next task for host managed_node1
13131 1726867198.13480: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
13131 1726867198.13484: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867198.13498: getting variables
13131 1726867198.13501: in VariableManager get_vars()
13131 1726867198.13553: Calling all_inventory to load vars for managed_node1
13131 1726867198.13556: Calling groups_inventory to load vars for managed_node1
13131 1726867198.13558: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867198.13568: Calling all_plugins_play to load vars for managed_node1
13131 1726867198.13570: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867198.13573: Calling groups_plugins_play to load vars for managed_node1
13131 1726867198.22282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867198.24688: done with get_vars()
13131 1726867198.24710: done getting variables
13131 1726867198.24808: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024  17:19:58 -0400 (0:00:00.350)       0:00:13.359 ******
13131 1726867198.24857: entering _queue_task() for managed_node1/package
13131 1726867198.25308: worker is 1 (out of 1 available)
13131 1726867198.25339: exiting _queue_task() for managed_node1/package
13131 1726867198.25384: done queuing things up, now waiting for results queue to drain
13131 1726867198.25386: waiting for pending results...
13131 1726867198.25629: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
13131 1726867198.25813: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000002f
13131 1726867198.25888: variable 'ansible_search_path' from source: unknown
13131 1726867198.25901: variable 'ansible_search_path' from source: unknown
13131 1726867198.25915: calling self._execute()
13131 1726867198.26025: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867198.26113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867198.26116: variable 'omit' from source: magic vars
13131 1726867198.26573: variable 'ansible_distribution_major_version' from source: facts
13131 1726867198.26594: Evaluated conditional (ansible_distribution_major_version != '6'): True
13131 1726867198.26759: variable 'network_state' from source: role '' defaults
13131 1726867198.26818: Evaluated conditional (network_state != {}): False
13131 1726867198.26822: when evaluation is False, skipping this task
13131 1726867198.26824: _execute() done
13131 1726867198.26827: dumping result to json
13131 1726867198.26829: done dumping result, returning
13131 1726867198.26832: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-5f24-9b7a-00000000002f]
13131 1726867198.26881: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000002f
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13131 1726867198.27021: no more pending results, returning what we have
13131 1726867198.27025: results queue empty
13131 1726867198.27026: checking for any_errors_fatal
13131 1726867198.27032: done checking for any_errors_fatal
13131 1726867198.27033: checking for max_fail_percentage
13131 1726867198.27035: done checking for max_fail_percentage
13131 1726867198.27035: checking to see if all hosts have failed and the running result is not ok
13131 1726867198.27036: done checking to see if all hosts have failed
13131 1726867198.27037: getting the remaining hosts for this loop
13131 1726867198.27039: done getting the remaining hosts for this loop
13131 1726867198.27042: getting the next task for host managed_node1
13131 1726867198.27049: done getting next task for host managed_node1
13131 1726867198.27052: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
13131 1726867198.27055: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867198.27069: getting variables
13131 1726867198.27070: in VariableManager get_vars()
13131 1726867198.27131: Calling all_inventory to load vars for managed_node1
13131 1726867198.27133: Calling groups_inventory to load vars for managed_node1
13131 1726867198.27136: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867198.27148: Calling all_plugins_play to load vars for managed_node1
13131 1726867198.27150: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867198.27153: Calling groups_plugins_play to load vars for managed_node1
13131 1726867198.27703: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000002f
13131 1726867198.27707: WORKER PROCESS EXITING
13131 1726867198.29035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867198.31105: done with get_vars()
13131 1726867198.31127: done getting variables
13131 1726867198.31204: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024  17:19:58 -0400 (0:00:00.063)       0:00:13.422 ******
13131 1726867198.31240: entering _queue_task() for managed_node1/package
13131 1726867198.31715: worker is 1 (out of 1 available)
13131 1726867198.31729: exiting _queue_task() for managed_node1/package
13131 1726867198.31921: done queuing things up, now waiting for results queue to drain
13131 1726867198.31923: waiting for pending results...
13131 1726867198.32454: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
13131 1726867198.32993: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000030
13131 1726867198.32997: variable 'ansible_search_path' from source: unknown
13131 1726867198.33000: variable 'ansible_search_path' from source: unknown
13131 1726867198.33050: calling self._execute()
13131 1726867198.33304: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867198.33312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867198.33316: variable 'omit' from source: magic vars
13131 1726867198.34316: variable 'ansible_distribution_major_version' from source: facts
13131 1726867198.34403: Evaluated conditional (ansible_distribution_major_version != '6'): True
13131 1726867198.34728: variable 'network_state' from source: role '' defaults
13131 1726867198.34751: Evaluated conditional (network_state != {}): False
13131 1726867198.34763: when evaluation is False, skipping this task
13131 1726867198.34770: _execute() done
13131 1726867198.34783: dumping result to json
13131 1726867198.34818: done dumping result, returning
13131 1726867198.34868: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-5f24-9b7a-000000000030]
13131 1726867198.35196: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000030
13131 1726867198.35287: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000030
13131 1726867198.35386: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13131 1726867198.35447: no more pending results, returning what we have
13131 1726867198.35451: results queue empty
13131 1726867198.35452: checking for any_errors_fatal
13131 1726867198.35461: done checking for any_errors_fatal
13131 1726867198.35464: checking for max_fail_percentage
13131 1726867198.35466: done checking for max_fail_percentage
13131 1726867198.35467: checking to see if all hosts have failed and the running result is not ok
13131 1726867198.35468: done checking to see if all hosts have failed
13131 1726867198.35469: getting the remaining hosts for this loop
13131 1726867198.35471: done getting the remaining hosts for this loop
13131 1726867198.35475: getting the next task for host managed_node1
13131 1726867198.35485: done getting next task for host managed_node1
13131 1726867198.35495: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
13131 1726867198.35500: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867198.35520: getting variables
13131 1726867198.35522: in VariableManager get_vars()
13131 1726867198.35918: Calling all_inventory to load vars for managed_node1
13131 1726867198.35922: Calling groups_inventory to load vars for managed_node1
13131 1726867198.35925: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867198.35935: Calling all_plugins_play to load vars for managed_node1
13131 1726867198.35937: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867198.35940: Calling groups_plugins_play to load vars for managed_node1
13131 1726867198.39894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867198.43289: done with get_vars()
13131 1726867198.43335: done getting variables
13131 1726867198.43446: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024  17:19:58 -0400 (0:00:00.122)       0:00:13.545 ******
13131 1726867198.43476: entering _queue_task() for managed_node1/service
13131 1726867198.43484: Creating lock for service
13131 1726867198.44052: worker is 1 (out of 1 available)
13131 1726867198.44064: exiting _queue_task() for managed_node1/service
13131 1726867198.44074: done queuing things up, now waiting for results queue to drain
13131 1726867198.44076: waiting for pending results...
13131 1726867198.44353: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
13131 1726867198.44553: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000031
13131 1726867198.44627: variable 'ansible_search_path' from source: unknown
13131 1726867198.44631: variable 'ansible_search_path' from source: unknown
13131 1726867198.44639: calling self._execute()
13131 1726867198.44763: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867198.44783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867198.44803: variable 'omit' from source: magic vars
13131 1726867198.45340: variable 'ansible_distribution_major_version' from source: facts
13131 1726867198.45356: Evaluated conditional (ansible_distribution_major_version != '6'): True
13131 1726867198.45614: variable '__network_wireless_connections_defined' from source: role '' defaults
13131 1726867198.45827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13131 1726867198.49311: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13131 1726867198.49542: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13131 1726867198.49682: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13131 1726867198.49700: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13131 1726867198.49730: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13131 1726867198.49830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13131 1726867198.50013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13131 1726867198.50046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13131 1726867198.50304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13131 1726867198.50308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13131 1726867198.50310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13131 1726867198.50313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13131 1726867198.50435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13131 1726867198.50481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13131 1726867198.50540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13131 1726867198.50587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13131 1726867198.50711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13131 1726867198.50958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13131 1726867198.50962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13131 1726867198.50964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13131 1726867198.51318: variable 'network_connections' from source: task vars
13131 1726867198.51336: variable 'controller_profile' from source: play vars
13131 1726867198.51516: variable 'controller_profile' from source: play vars
13131 1726867198.51535: variable 'controller_device' from source: play vars
13131 1726867198.51634: variable 'controller_device' from source: play vars
13131 1726867198.51650: variable 'port1_profile' from source: play vars
13131 1726867198.51784: variable 'port1_profile' from source: play vars
13131 1726867198.51838: variable 'dhcp_interface1' from source: play vars
13131 1726867198.51965: variable 'dhcp_interface1' from source: play vars
13131 1726867198.51980: variable 'controller_profile' from source: play vars
13131 1726867198.52111: variable 'controller_profile' from source: play vars
13131 1726867198.52164: variable 'port2_profile' from source: play vars
13131 1726867198.52373: variable 'port2_profile' from source: play vars
13131 1726867198.52376: variable 'dhcp_interface2' from source: play vars
13131 1726867198.52425: variable 'dhcp_interface2' from source: play vars
13131 1726867198.52495: variable 'controller_profile' from source: play vars
13131 1726867198.52558: variable 'controller_profile' from source: play vars
13131 1726867198.52721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13131 1726867198.53183: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13131 1726867198.53226: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13131 1726867198.53300: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13131 1726867198.53337: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13131 1726867198.53398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13131 1726867198.53424: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13131 1726867198.53451: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13131 1726867198.53497: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13131 1726867198.53580: variable '__network_team_connections_defined' from source: role '' defaults
13131 1726867198.53842: variable 'network_connections' from source: task vars
13131 1726867198.53852: variable 'controller_profile' from source: play vars
13131 1726867198.53927: variable 'controller_profile' from source: play vars
13131 1726867198.53938: variable 'controller_device' from source: play vars
13131 1726867198.54016: variable 'controller_device' from source: play vars
13131 1726867198.54112: variable 'port1_profile' from source: play vars
13131 1726867198.54115: variable 'port1_profile' from source: play vars
13131 1726867198.54119: variable 'dhcp_interface1' from source: play vars
13131 1726867198.54168: variable 'dhcp_interface1' from source: play vars
13131 1726867198.54182: variable 'controller_profile' from source: play vars
13131 1726867198.54255: variable 'controller_profile' from source: play vars
13131 1726867198.54267: variable 'port2_profile' from source: play vars
13131 1726867198.54343: variable 'port2_profile' from source: play vars
13131 1726867198.54382: variable 'dhcp_interface2' from source: play vars
13131 1726867198.54421: variable 'dhcp_interface2' from source: play vars
13131 1726867198.54441: variable 'controller_profile' from source: play vars
13131 1726867198.54509: variable 'controller_profile' from source: play vars
13131 1726867198.54548: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
13131 1726867198.54581: when evaluation is False, skipping this task
13131 1726867198.54584: _execute() done
13131 1726867198.54586: dumping result to json
13131 1726867198.54587: done dumping result, returning
13131 1726867198.54589: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-000000000031]
13131 1726867198.54593: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000031
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
13131 1726867198.54858: no more pending results, returning what we have
13131 1726867198.54862: results queue empty
13131 1726867198.54863: checking for any_errors_fatal
13131 1726867198.54876: done checking for any_errors_fatal
13131 1726867198.54879: checking for max_fail_percentage
13131 1726867198.54882: done checking for max_fail_percentage
13131 1726867198.54882: checking to see if all hosts have failed and the running result is not ok
13131 1726867198.54883: done checking to see if all hosts have failed
13131 1726867198.54884: getting the remaining hosts for this loop
13131 1726867198.54886: done getting the remaining hosts for this loop
13131 1726867198.54890: getting the next task for host managed_node1
13131 1726867198.54902: done getting next task for host managed_node1
13131 1726867198.54906: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
13131 1726867198.54909: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867198.54995: getting variables
13131 1726867198.54998: in VariableManager get_vars()
13131 1726867198.55151: Calling all_inventory to load vars for managed_node1
13131 1726867198.55155: Calling groups_inventory to load vars for managed_node1
13131 1726867198.55157: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867198.55167: Calling all_plugins_play to load vars for managed_node1
13131 1726867198.55170: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867198.55173: Calling groups_plugins_play to load vars for managed_node1
13131 1726867198.55757: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000031
13131 1726867198.55760: WORKER PROCESS EXITING
13131 1726867198.57711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867198.59542: done with get_vars()
13131 1726867198.59563: done getting variables
13131 1726867198.59633: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024  17:19:58 -0400 (0:00:00.161)       0:00:13.707 ******
13131 1726867198.59666: entering _queue_task() for managed_node1/service
13131 1726867198.60012: worker is 1 (out of 1 available)
13131 1726867198.60030: exiting _queue_task() for managed_node1/service
13131 1726867198.60046: done queuing things up, now waiting for results queue to drain
13131 1726867198.60048: waiting for pending results...
13131 1726867198.60295: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
13131 1726867198.60469: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000032
13131 1726867198.60517: variable 'ansible_search_path' from source: unknown
13131 1726867198.60527: variable 'ansible_search_path' from source: unknown
13131 1726867198.60586: calling self._execute()
13131 1726867198.60713: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867198.60727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867198.60742: variable 'omit' from source: magic vars
13131 1726867198.61237: variable 'ansible_distribution_major_version' from source: facts
13131 1726867198.61240: Evaluated conditional (ansible_distribution_major_version != '6'): True
13131 1726867198.61458: variable 'network_provider' from source: set_fact
13131 1726867198.61485: variable 'network_state' from source: role '' defaults
13131 1726867198.61500: Evaluated conditional (network_provider == "nm" or network_state != {}): True
13131 1726867198.61511: variable 'omit' from source: magic vars
13131 1726867198.61619: variable 'omit' from source: magic vars
13131 1726867198.61770: variable 'network_service_name' from source: role '' defaults
13131 1726867198.61903: variable 'network_service_name' from source: role '' defaults
13131 1726867198.62298: variable '__network_provider_setup' from source: role '' defaults
13131 1726867198.62304: variable '__network_service_name_default_nm' from source: role '' defaults
13131 1726867198.62307: variable '__network_service_name_default_nm' from source: role '' defaults
13131 1726867198.62309: variable '__network_packages_default_nm' from source: role '' defaults
13131 1726867198.62458: variable '__network_packages_default_nm' from source: role '' defaults
13131 1726867198.62712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13131 1726867198.64672: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13131 1726867198.64729: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13131 1726867198.64755: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13131 1726867198.64781: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13131 1726867198.64807: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13131 1726867198.64862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13131 1726867198.64883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13131 1726867198.64905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13131 1726867198.64936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13131 1726867198.64947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13131 1726867198.64980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13131 1726867198.64999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13131 1726867198.65017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13131 1726867198.65045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13131 1726867198.65056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13131 1726867198.65411: variable '__network_packages_default_gobject_packages' from source: role '' defaults
13131 1726867198.65415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13131 1726867198.65444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13131 1726867198.65474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13131 1726867198.65557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13131 1726867198.65600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13131 1726867198.65764: variable 'ansible_python' from source: facts
13131 1726867198.65869: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
13131 1726867198.66146: variable '__network_wpa_supplicant_required' from source: role '' defaults
13131 1726867198.66238: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
13131 1726867198.66415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13131 1726867198.66464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13131 1726867198.66505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13131 1726867198.66548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13131 1726867198.66567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13131 1726867198.66612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13131 1726867198.66645: Loading FilterModule
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867198.66665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867198.66707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867198.66720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867198.66889: variable 'network_connections' from source: task vars 13131 1726867198.66892: variable 'controller_profile' from source: play vars 13131 1726867198.67082: variable 'controller_profile' from source: play vars 13131 1726867198.67086: variable 'controller_device' from source: play vars 13131 1726867198.67088: variable 'controller_device' from source: play vars 13131 1726867198.67090: variable 'port1_profile' from source: play vars 13131 1726867198.67092: variable 'port1_profile' from source: play vars 13131 1726867198.67097: variable 'dhcp_interface1' from source: play vars 13131 1726867198.67165: variable 'dhcp_interface1' from source: play vars 13131 1726867198.67175: variable 'controller_profile' from source: play vars 13131 1726867198.67248: variable 'controller_profile' from source: play vars 13131 1726867198.67259: variable 'port2_profile' from source: play vars 13131 1726867198.67331: variable 'port2_profile' from source: play vars 13131 1726867198.67341: variable 'dhcp_interface2' from source: play vars 13131 1726867198.67433: variable 'dhcp_interface2' from source: play vars 13131 
1726867198.67436: variable 'controller_profile' from source: play vars 13131 1726867198.67500: variable 'controller_profile' from source: play vars 13131 1726867198.67628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867198.67771: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867198.67810: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867198.67844: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867198.67878: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867198.67925: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867198.67945: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867198.67972: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867198.67996: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867198.68034: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867198.68213: variable 'network_connections' from source: task vars 13131 1726867198.68218: variable 'controller_profile' from source: play vars 13131 1726867198.68288: variable 'controller_profile' from source: play vars 13131 
1726867198.68300: variable 'controller_device' from source: play vars 13131 1726867198.68411: variable 'controller_device' from source: play vars 13131 1726867198.68414: variable 'port1_profile' from source: play vars 13131 1726867198.68582: variable 'port1_profile' from source: play vars 13131 1726867198.68585: variable 'dhcp_interface1' from source: play vars 13131 1726867198.68588: variable 'dhcp_interface1' from source: play vars 13131 1726867198.68590: variable 'controller_profile' from source: play vars 13131 1726867198.68609: variable 'controller_profile' from source: play vars 13131 1726867198.68620: variable 'port2_profile' from source: play vars 13131 1726867198.68687: variable 'port2_profile' from source: play vars 13131 1726867198.68697: variable 'dhcp_interface2' from source: play vars 13131 1726867198.68764: variable 'dhcp_interface2' from source: play vars 13131 1726867198.68774: variable 'controller_profile' from source: play vars 13131 1726867198.68843: variable 'controller_profile' from source: play vars 13131 1726867198.68891: variable '__network_packages_default_wireless' from source: role '' defaults 13131 1726867198.68963: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867198.69243: variable 'network_connections' from source: task vars 13131 1726867198.69247: variable 'controller_profile' from source: play vars 13131 1726867198.69315: variable 'controller_profile' from source: play vars 13131 1726867198.69321: variable 'controller_device' from source: play vars 13131 1726867198.69387: variable 'controller_device' from source: play vars 13131 1726867198.69396: variable 'port1_profile' from source: play vars 13131 1726867198.69458: variable 'port1_profile' from source: play vars 13131 1726867198.69464: variable 'dhcp_interface1' from source: play vars 13131 1726867198.69538: variable 'dhcp_interface1' from source: play vars 13131 1726867198.69544: variable 'controller_profile' from source: play vars 
13131 1726867198.69610: variable 'controller_profile' from source: play vars 13131 1726867198.69617: variable 'port2_profile' from source: play vars 13131 1726867198.69831: variable 'port2_profile' from source: play vars 13131 1726867198.69834: variable 'dhcp_interface2' from source: play vars 13131 1726867198.69944: variable 'dhcp_interface2' from source: play vars 13131 1726867198.69947: variable 'controller_profile' from source: play vars 13131 1726867198.69973: variable 'controller_profile' from source: play vars 13131 1726867198.70019: variable '__network_packages_default_team' from source: role '' defaults 13131 1726867198.70097: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867198.70571: variable 'network_connections' from source: task vars 13131 1726867198.70574: variable 'controller_profile' from source: play vars 13131 1726867198.70579: variable 'controller_profile' from source: play vars 13131 1726867198.70581: variable 'controller_device' from source: play vars 13131 1726867198.70584: variable 'controller_device' from source: play vars 13131 1726867198.70586: variable 'port1_profile' from source: play vars 13131 1726867198.70657: variable 'port1_profile' from source: play vars 13131 1726867198.70674: variable 'dhcp_interface1' from source: play vars 13131 1726867198.70732: variable 'dhcp_interface1' from source: play vars 13131 1726867198.70737: variable 'controller_profile' from source: play vars 13131 1726867198.70785: variable 'controller_profile' from source: play vars 13131 1726867198.70794: variable 'port2_profile' from source: play vars 13131 1726867198.70844: variable 'port2_profile' from source: play vars 13131 1726867198.70850: variable 'dhcp_interface2' from source: play vars 13131 1726867198.70905: variable 'dhcp_interface2' from source: play vars 13131 1726867198.70910: variable 'controller_profile' from source: play vars 13131 1726867198.70962: variable 'controller_profile' from source: play vars 
13131 1726867198.71007: variable '__network_service_name_default_initscripts' from source: role '' defaults 13131 1726867198.71052: variable '__network_service_name_default_initscripts' from source: role '' defaults 13131 1726867198.71058: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 1726867198.71101: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 1726867198.71231: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13131 1726867198.71604: variable 'network_connections' from source: task vars 13131 1726867198.71607: variable 'controller_profile' from source: play vars 13131 1726867198.71882: variable 'controller_profile' from source: play vars 13131 1726867198.71886: variable 'controller_device' from source: play vars 13131 1726867198.71888: variable 'controller_device' from source: play vars 13131 1726867198.71890: variable 'port1_profile' from source: play vars 13131 1726867198.71895: variable 'port1_profile' from source: play vars 13131 1726867198.71897: variable 'dhcp_interface1' from source: play vars 13131 1726867198.71899: variable 'dhcp_interface1' from source: play vars 13131 1726867198.71900: variable 'controller_profile' from source: play vars 13131 1726867198.71917: variable 'controller_profile' from source: play vars 13131 1726867198.71924: variable 'port2_profile' from source: play vars 13131 1726867198.71985: variable 'port2_profile' from source: play vars 13131 1726867198.71997: variable 'dhcp_interface2' from source: play vars 13131 1726867198.72048: variable 'dhcp_interface2' from source: play vars 13131 1726867198.72129: variable 'controller_profile' from source: play vars 13131 1726867198.72132: variable 'controller_profile' from source: play vars 13131 1726867198.72135: variable 'ansible_distribution' from source: facts 13131 1726867198.72137: variable '__network_rh_distros' from source: role '' defaults 13131 1726867198.72139: 
variable 'ansible_distribution_major_version' from source: facts 13131 1726867198.72146: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13131 1726867198.72303: variable 'ansible_distribution' from source: facts 13131 1726867198.72307: variable '__network_rh_distros' from source: role '' defaults 13131 1726867198.72311: variable 'ansible_distribution_major_version' from source: facts 13131 1726867198.72329: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13131 1726867198.72605: variable 'ansible_distribution' from source: facts 13131 1726867198.72608: variable '__network_rh_distros' from source: role '' defaults 13131 1726867198.72610: variable 'ansible_distribution_major_version' from source: facts 13131 1726867198.72613: variable 'network_provider' from source: set_fact 13131 1726867198.72615: variable 'omit' from source: magic vars 13131 1726867198.72617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867198.72619: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867198.72621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867198.72623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867198.72783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867198.72786: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867198.72789: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867198.72793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867198.72796: Set connection var ansible_connection to ssh 13131 
1726867198.72798: Set connection var ansible_timeout to 10 13131 1726867198.72800: Set connection var ansible_shell_type to sh 13131 1726867198.72802: Set connection var ansible_shell_executable to /bin/sh 13131 1726867198.72803: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867198.72805: Set connection var ansible_pipelining to False 13131 1726867198.72813: variable 'ansible_shell_executable' from source: unknown 13131 1726867198.72815: variable 'ansible_connection' from source: unknown 13131 1726867198.72817: variable 'ansible_module_compression' from source: unknown 13131 1726867198.72822: variable 'ansible_shell_type' from source: unknown 13131 1726867198.72824: variable 'ansible_shell_executable' from source: unknown 13131 1726867198.72827: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867198.72829: variable 'ansible_pipelining' from source: unknown 13131 1726867198.72833: variable 'ansible_timeout' from source: unknown 13131 1726867198.72835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867198.72932: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867198.72942: variable 'omit' from source: magic vars 13131 1726867198.72947: starting attempt loop 13131 1726867198.72950: running the handler 13131 1726867198.73024: variable 'ansible_facts' from source: unknown 13131 1726867198.73647: _low_level_execute_command(): starting 13131 1726867198.73650: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867198.74109: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13131 1726867198.74115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867198.74118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867198.74169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867198.74172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867198.74228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867198.75898: stdout chunk (state=3): >>>/root <<< 13131 1726867198.76022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867198.76025: stdout chunk (state=3): >>><<< 13131 1726867198.76032: stderr chunk (state=3): >>><<< 13131 1726867198.76047: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867198.76057: _low_level_execute_command(): starting 13131 1726867198.76062: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867198.7604635-13906-7838653499146 `" && echo ansible-tmp-1726867198.7604635-13906-7838653499146="` echo /root/.ansible/tmp/ansible-tmp-1726867198.7604635-13906-7838653499146 `" ) && sleep 0' 13131 1726867198.76440: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867198.76471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867198.76474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867198.76478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867198.76524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867198.76528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867198.76581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867198.78447: stdout chunk (state=3): >>>ansible-tmp-1726867198.7604635-13906-7838653499146=/root/.ansible/tmp/ansible-tmp-1726867198.7604635-13906-7838653499146 <<< 13131 1726867198.78553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867198.78581: stderr chunk (state=3): >>><<< 13131 1726867198.78586: stdout chunk (state=3): >>><<< 13131 1726867198.78599: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867198.7604635-13906-7838653499146=/root/.ansible/tmp/ansible-tmp-1726867198.7604635-13906-7838653499146 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867198.78617: variable 'ansible_module_compression' from source: unknown 13131 1726867198.78655: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 13131 1726867198.78658: ANSIBALLZ: Acquiring lock 13131 1726867198.78661: ANSIBALLZ: Lock acquired: 140192901613856 13131 1726867198.78663: ANSIBALLZ: Creating module 13131 1726867199.07040: ANSIBALLZ: Writing module into payload 13131 1726867199.07297: ANSIBALLZ: Writing module 13131 1726867199.07329: ANSIBALLZ: Renaming module 13131 1726867199.07374: ANSIBALLZ: Done creating module 13131 1726867199.07584: variable 'ansible_facts' from source: unknown 13131 1726867199.07848: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867198.7604635-13906-7838653499146/AnsiballZ_systemd.py 13131 1726867199.08200: Sending initial data 13131 1726867199.08203: Sent initial data (154 bytes) 13131 1726867199.09609: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867199.09656: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867199.09675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867199.09700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867199.09765: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867199.09804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867199.09820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867199.10090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867199.10357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867199.12003: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867199.12051: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867199.12098: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp7c0ihqfb /root/.ansible/tmp/ansible-tmp-1726867198.7604635-13906-7838653499146/AnsiballZ_systemd.py <<< 13131 1726867199.12109: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867198.7604635-13906-7838653499146/AnsiballZ_systemd.py" <<< 13131 1726867199.12149: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp7c0ihqfb" to remote "/root/.ansible/tmp/ansible-tmp-1726867198.7604635-13906-7838653499146/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867198.7604635-13906-7838653499146/AnsiballZ_systemd.py" <<< 13131 1726867199.16243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867199.16247: stdout chunk (state=3): >>><<< 13131 1726867199.16249: stderr chunk (state=3): >>><<< 13131 1726867199.16256: done transferring module to remote 13131 1726867199.16258: _low_level_execute_command(): starting 13131 1726867199.16260: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867198.7604635-13906-7838653499146/ /root/.ansible/tmp/ansible-tmp-1726867198.7604635-13906-7838653499146/AnsiballZ_systemd.py && sleep 0' 13131 1726867199.17645: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867199.17648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867199.17651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867199.17663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867199.17676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 
1726867199.17686: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867199.17712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867199.17875: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867199.17971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867199.18092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867199.18182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867199.20049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867199.20053: stdout chunk (state=3): >>><<< 13131 1726867199.20055: stderr chunk (state=3): >>><<< 13131 1726867199.20058: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867199.20060: _low_level_execute_command(): starting 13131 1726867199.20062: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867198.7604635-13906-7838653499146/AnsiballZ_systemd.py && sleep 0' 13131 1726867199.20859: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867199.20939: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867199.20976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/ac0999e354' <<< 13131 1726867199.20997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867199.21017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867199.21088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867199.49918: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": "14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; 
argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call or<<< 13131 1726867199.49943: stdout chunk (state=3): >>>g.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10604544", "MemoryPeak": "14745600", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3293945856", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "673826000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", 
"StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", 
"SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target NetworkManager-wait-online.service network.target", "After": "dbus-broker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": 
"fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", "AssertTimestampMonotonic": "14925366", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13131 1726867199.51790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867199.51842: stderr chunk (state=3): >>><<< 13131 1726867199.51848: stdout chunk (state=3): >>><<< 13131 1726867199.52099: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": "14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10604544", "MemoryPeak": "14745600", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3293945856", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "673826000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target NetworkManager-wait-online.service network.target", "After": "dbus-broker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", "AssertTimestampMonotonic": "14925366", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867199.52113: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867198.7604635-13906-7838653499146/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867199.52128: _low_level_execute_command(): starting 13131 1726867199.52145: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867198.7604635-13906-7838653499146/ > /dev/null 2>&1 && sleep 0' 13131 1726867199.52991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867199.53093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867199.53197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867199.53222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867199.53260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867199.53327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867199.55282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867199.55285: stdout chunk (state=3): >>><<< 13131 1726867199.55287: stderr chunk (state=3): >>><<< 13131 1726867199.55289: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 
1726867199.55294: handler run complete 13131 1726867199.55325: attempt loop complete, returning result 13131 1726867199.55328: _execute() done 13131 1726867199.55331: dumping result to json 13131 1726867199.55350: done dumping result, returning 13131 1726867199.55359: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-5f24-9b7a-000000000032] 13131 1726867199.55362: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000032 13131 1726867199.55655: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000032 13131 1726867199.55658: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867199.55722: no more pending results, returning what we have 13131 1726867199.55726: results queue empty 13131 1726867199.55727: checking for any_errors_fatal 13131 1726867199.55733: done checking for any_errors_fatal 13131 1726867199.55734: checking for max_fail_percentage 13131 1726867199.55736: done checking for max_fail_percentage 13131 1726867199.55736: checking to see if all hosts have failed and the running result is not ok 13131 1726867199.55737: done checking to see if all hosts have failed 13131 1726867199.55738: getting the remaining hosts for this loop 13131 1726867199.55739: done getting the remaining hosts for this loop 13131 1726867199.55743: getting the next task for host managed_node1 13131 1726867199.55750: done getting next task for host managed_node1 13131 1726867199.55754: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13131 1726867199.55757: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867199.55769: getting variables 13131 1726867199.55770: in VariableManager get_vars() 13131 1726867199.55827: Calling all_inventory to load vars for managed_node1 13131 1726867199.55830: Calling groups_inventory to load vars for managed_node1 13131 1726867199.55833: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867199.55843: Calling all_plugins_play to load vars for managed_node1 13131 1726867199.55846: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867199.55850: Calling groups_plugins_play to load vars for managed_node1 13131 1726867199.57636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867199.58774: done with get_vars() 13131 1726867199.58809: done getting variables 13131 1726867199.58939: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:19:59 -0400 (0:00:00.993) 0:00:14.700 ****** 13131 1726867199.59016: entering _queue_task() for managed_node1/service 13131 1726867199.59535: worker is 1 (out of 1 available) 13131 1726867199.59548: exiting _queue_task() for managed_node1/service 
13131 1726867199.59559: done queuing things up, now waiting for results queue to drain 13131 1726867199.59560: waiting for pending results... 13131 1726867199.59942: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13131 1726867199.59947: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000033 13131 1726867199.59951: variable 'ansible_search_path' from source: unknown 13131 1726867199.59954: variable 'ansible_search_path' from source: unknown 13131 1726867199.59965: calling self._execute() 13131 1726867199.60044: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867199.60079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867199.60083: variable 'omit' from source: magic vars 13131 1726867199.60660: variable 'ansible_distribution_major_version' from source: facts 13131 1726867199.60669: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867199.60769: variable 'network_provider' from source: set_fact 13131 1726867199.60772: Evaluated conditional (network_provider == "nm"): True 13131 1726867199.60883: variable '__network_wpa_supplicant_required' from source: role '' defaults 13131 1726867199.60964: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13131 1726867199.61108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867199.64287: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867199.64402: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867199.64509: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867199.64579: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867199.64641: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867199.64989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867199.64993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867199.64996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867199.64998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867199.65000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867199.65002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867199.65003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867199.65005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 13131 1726867199.65049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867199.65067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867199.65122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867199.65153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867199.65186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867199.65246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867199.65291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867199.65456: variable 'network_connections' from source: task vars 13131 1726867199.65474: variable 'controller_profile' from source: play vars 13131 1726867199.65573: variable 'controller_profile' from source: play vars 13131 1726867199.65599: variable 'controller_device' from source: play vars 13131 1726867199.65825: variable 'controller_device' from source: play vars 13131 1726867199.65954: 
variable 'port1_profile' from source: play vars 13131 1726867199.66093: variable 'port1_profile' from source: play vars 13131 1726867199.66101: variable 'dhcp_interface1' from source: play vars 13131 1726867199.66319: variable 'dhcp_interface1' from source: play vars 13131 1726867199.66353: variable 'controller_profile' from source: play vars 13131 1726867199.66523: variable 'controller_profile' from source: play vars 13131 1726867199.66527: variable 'port2_profile' from source: play vars 13131 1726867199.66575: variable 'port2_profile' from source: play vars 13131 1726867199.66595: variable 'dhcp_interface2' from source: play vars 13131 1726867199.66788: variable 'dhcp_interface2' from source: play vars 13131 1726867199.66800: variable 'controller_profile' from source: play vars 13131 1726867199.66909: variable 'controller_profile' from source: play vars 13131 1726867199.67101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867199.67314: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867199.67359: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867199.67419: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867199.67451: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867199.67516: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867199.67541: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867199.67557: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867199.67638: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867199.67660: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867199.67932: variable 'network_connections' from source: task vars 13131 1726867199.67935: variable 'controller_profile' from source: play vars 13131 1726867199.67971: variable 'controller_profile' from source: play vars 13131 1726867199.67979: variable 'controller_device' from source: play vars 13131 1726867199.68054: variable 'controller_device' from source: play vars 13131 1726867199.68057: variable 'port1_profile' from source: play vars 13131 1726867199.68116: variable 'port1_profile' from source: play vars 13131 1726867199.68127: variable 'dhcp_interface1' from source: play vars 13131 1726867199.68195: variable 'dhcp_interface1' from source: play vars 13131 1726867199.68201: variable 'controller_profile' from source: play vars 13131 1726867199.68256: variable 'controller_profile' from source: play vars 13131 1726867199.68268: variable 'port2_profile' from source: play vars 13131 1726867199.68332: variable 'port2_profile' from source: play vars 13131 1726867199.68343: variable 'dhcp_interface2' from source: play vars 13131 1726867199.68407: variable 'dhcp_interface2' from source: play vars 13131 1726867199.68413: variable 'controller_profile' from source: play vars 13131 1726867199.68468: variable 'controller_profile' from source: play vars 13131 1726867199.68509: Evaluated conditional (__network_wpa_supplicant_required): False 13131 1726867199.68513: when evaluation is False, skipping this task 13131 1726867199.68515: _execute() done 13131 1726867199.68518: 
dumping result to json 13131 1726867199.68520: done dumping result, returning 13131 1726867199.68529: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-5f24-9b7a-000000000033] 13131 1726867199.68537: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000033 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13131 1726867199.68799: no more pending results, returning what we have 13131 1726867199.68803: results queue empty 13131 1726867199.68803: checking for any_errors_fatal 13131 1726867199.68820: done checking for any_errors_fatal 13131 1726867199.68821: checking for max_fail_percentage 13131 1726867199.68822: done checking for max_fail_percentage 13131 1726867199.68823: checking to see if all hosts have failed and the running result is not ok 13131 1726867199.68824: done checking to see if all hosts have failed 13131 1726867199.68824: getting the remaining hosts for this loop 13131 1726867199.68835: done getting the remaining hosts for this loop 13131 1726867199.68840: getting the next task for host managed_node1 13131 1726867199.68848: done getting next task for host managed_node1 13131 1726867199.68852: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13131 1726867199.68854: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867199.68867: getting variables 13131 1726867199.68869: in VariableManager get_vars() 13131 1726867199.68932: Calling all_inventory to load vars for managed_node1 13131 1726867199.69037: Calling groups_inventory to load vars for managed_node1 13131 1726867199.69041: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867199.69054: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000033 13131 1726867199.69061: WORKER PROCESS EXITING 13131 1726867199.69096: Calling all_plugins_play to load vars for managed_node1 13131 1726867199.69100: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867199.69104: Calling groups_plugins_play to load vars for managed_node1 13131 1726867199.70735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867199.71847: done with get_vars() 13131 1726867199.71861: done getting variables 13131 1726867199.71909: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:19:59 -0400 (0:00:00.129) 0:00:14.829 ****** 13131 1726867199.71932: entering _queue_task() for managed_node1/service 13131 1726867199.72235: worker is 1 (out of 1 available) 13131 1726867199.72248: exiting _queue_task() for managed_node1/service 13131 1726867199.72259: done queuing things up, now waiting for results queue to drain 13131 1726867199.72260: waiting for pending results... 
13131 1726867199.72600: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 13131 1726867199.72674: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000034 13131 1726867199.72691: variable 'ansible_search_path' from source: unknown 13131 1726867199.72695: variable 'ansible_search_path' from source: unknown 13131 1726867199.72737: calling self._execute() 13131 1726867199.72856: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867199.72860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867199.72863: variable 'omit' from source: magic vars 13131 1726867199.73236: variable 'ansible_distribution_major_version' from source: facts 13131 1726867199.73247: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867199.73394: variable 'network_provider' from source: set_fact 13131 1726867199.73401: Evaluated conditional (network_provider == "initscripts"): False 13131 1726867199.73404: when evaluation is False, skipping this task 13131 1726867199.73406: _execute() done 13131 1726867199.73409: dumping result to json 13131 1726867199.73411: done dumping result, returning 13131 1726867199.73414: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-5f24-9b7a-000000000034] 13131 1726867199.73416: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000034 13131 1726867199.73495: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000034 13131 1726867199.73498: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867199.73552: no more pending results, returning what we have 13131 1726867199.73555: results queue empty 13131 1726867199.73556: checking for any_errors_fatal 13131 1726867199.73562: done checking for 
any_errors_fatal 13131 1726867199.73563: checking for max_fail_percentage 13131 1726867199.73564: done checking for max_fail_percentage 13131 1726867199.73565: checking to see if all hosts have failed and the running result is not ok 13131 1726867199.73566: done checking to see if all hosts have failed 13131 1726867199.73566: getting the remaining hosts for this loop 13131 1726867199.73567: done getting the remaining hosts for this loop 13131 1726867199.73570: getting the next task for host managed_node1 13131 1726867199.73576: done getting next task for host managed_node1 13131 1726867199.73582: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13131 1726867199.73585: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867199.73598: getting variables 13131 1726867199.73600: in VariableManager get_vars() 13131 1726867199.73642: Calling all_inventory to load vars for managed_node1 13131 1726867199.73645: Calling groups_inventory to load vars for managed_node1 13131 1726867199.73647: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867199.73655: Calling all_plugins_play to load vars for managed_node1 13131 1726867199.73657: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867199.73659: Calling groups_plugins_play to load vars for managed_node1 13131 1726867199.74564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867199.76219: done with get_vars() 13131 1726867199.76240: done getting variables 13131 1726867199.76320: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:19:59 -0400 (0:00:00.044) 0:00:14.874 ****** 13131 1726867199.76352: entering _queue_task() for managed_node1/copy 13131 1726867199.76660: worker is 1 (out of 1 available) 13131 1726867199.76673: exiting _queue_task() for managed_node1/copy 13131 1726867199.76893: done queuing things up, now waiting for results queue to drain 13131 1726867199.76897: waiting for pending results... 
13131 1726867199.77134: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13131 1726867199.77205: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000035 13131 1726867199.77209: variable 'ansible_search_path' from source: unknown 13131 1726867199.77211: variable 'ansible_search_path' from source: unknown 13131 1726867199.77214: calling self._execute() 13131 1726867199.77325: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867199.77329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867199.77332: variable 'omit' from source: magic vars 13131 1726867199.77699: variable 'ansible_distribution_major_version' from source: facts 13131 1726867199.77703: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867199.77817: variable 'network_provider' from source: set_fact 13131 1726867199.77821: Evaluated conditional (network_provider == "initscripts"): False 13131 1726867199.77824: when evaluation is False, skipping this task 13131 1726867199.77826: _execute() done 13131 1726867199.77829: dumping result to json 13131 1726867199.77831: done dumping result, returning 13131 1726867199.77835: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-5f24-9b7a-000000000035] 13131 1726867199.77838: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000035 13131 1726867199.78145: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000035 13131 1726867199.78149: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13131 1726867199.78195: no more pending results, returning what we have 13131 1726867199.78198: results queue empty 13131 1726867199.78199: checking for 
any_errors_fatal 13131 1726867199.78203: done checking for any_errors_fatal 13131 1726867199.78204: checking for max_fail_percentage 13131 1726867199.78205: done checking for max_fail_percentage 13131 1726867199.78206: checking to see if all hosts have failed and the running result is not ok 13131 1726867199.78207: done checking to see if all hosts have failed 13131 1726867199.78207: getting the remaining hosts for this loop 13131 1726867199.78209: done getting the remaining hosts for this loop 13131 1726867199.78212: getting the next task for host managed_node1 13131 1726867199.78217: done getting next task for host managed_node1 13131 1726867199.78220: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13131 1726867199.78223: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867199.78235: getting variables 13131 1726867199.78236: in VariableManager get_vars() 13131 1726867199.78280: Calling all_inventory to load vars for managed_node1 13131 1726867199.78283: Calling groups_inventory to load vars for managed_node1 13131 1726867199.78285: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867199.78295: Calling all_plugins_play to load vars for managed_node1 13131 1726867199.78298: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867199.78301: Calling groups_plugins_play to load vars for managed_node1 13131 1726867199.79669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867199.81435: done with get_vars() 13131 1726867199.81464: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:19:59 -0400 (0:00:00.052) 0:00:14.926 ****** 13131 1726867199.81579: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 13131 1726867199.81584: Creating lock for fedora.linux_system_roles.network_connections 13131 1726867199.81937: worker is 1 (out of 1 available) 13131 1726867199.81949: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 13131 1726867199.81961: done queuing things up, now waiting for results queue to drain 13131 1726867199.81962: waiting for pending results... 
13131 1726867199.82511: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13131 1726867199.82517: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000036 13131 1726867199.82520: variable 'ansible_search_path' from source: unknown 13131 1726867199.82523: variable 'ansible_search_path' from source: unknown 13131 1726867199.82527: calling self._execute() 13131 1726867199.82593: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867199.82598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867199.82609: variable 'omit' from source: magic vars 13131 1726867199.83083: variable 'ansible_distribution_major_version' from source: facts 13131 1726867199.83087: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867199.83090: variable 'omit' from source: magic vars 13131 1726867199.83092: variable 'omit' from source: magic vars 13131 1726867199.83484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867199.85959: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867199.86026: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867199.86062: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867199.86104: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867199.86129: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867199.86211: variable 'network_provider' from source: set_fact 13131 1726867199.86344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867199.86386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867199.86420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867199.86460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867199.86474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867199.86553: variable 'omit' from source: magic vars 13131 1726867199.86668: variable 'omit' from source: magic vars 13131 1726867199.86774: variable 'network_connections' from source: task vars 13131 1726867199.86786: variable 'controller_profile' from source: play vars 13131 1726867199.86914: variable 'controller_profile' from source: play vars 13131 1726867199.86923: variable 'controller_device' from source: play vars 13131 1726867199.86984: variable 'controller_device' from source: play vars 13131 1726867199.86994: variable 'port1_profile' from source: play vars 13131 1726867199.87053: variable 'port1_profile' from source: play vars 13131 1726867199.87060: variable 'dhcp_interface1' from source: play vars 13131 1726867199.87182: variable 'dhcp_interface1' from source: play vars 13131 1726867199.87185: variable 'controller_profile' from source: play vars 13131 1726867199.87194: variable 'controller_profile' from source: play vars 13131 1726867199.87203: 
variable 'port2_profile' from source: play vars 13131 1726867199.87262: variable 'port2_profile' from source: play vars 13131 1726867199.87270: variable 'dhcp_interface2' from source: play vars 13131 1726867199.87334: variable 'dhcp_interface2' from source: play vars 13131 1726867199.87340: variable 'controller_profile' from source: play vars 13131 1726867199.87404: variable 'controller_profile' from source: play vars 13131 1726867199.87613: variable 'omit' from source: magic vars 13131 1726867199.87616: variable '__lsr_ansible_managed' from source: task vars 13131 1726867199.87663: variable '__lsr_ansible_managed' from source: task vars 13131 1726867199.87841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13131 1726867199.88144: Loaded config def from plugin (lookup/template) 13131 1726867199.88492: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13131 1726867199.88496: File lookup term: get_ansible_managed.j2 13131 1726867199.88498: variable 'ansible_search_path' from source: unknown 13131 1726867199.88501: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13131 1726867199.88505: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13131 1726867199.88507: variable 'ansible_search_path' from source: unknown 13131 1726867199.97687: variable 'ansible_managed' from source: unknown 13131 1726867199.97819: variable 'omit' from source: magic vars 13131 1726867199.97843: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867199.97875: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867199.97894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867199.97915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867199.97925: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867199.97951: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867199.97957: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867199.97965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867199.98072: Set connection var ansible_connection to ssh 13131 1726867199.98076: Set connection var ansible_timeout to 10 13131 1726867199.98080: Set connection var ansible_shell_type to sh 13131 1726867199.98082: Set connection var ansible_shell_executable to /bin/sh 13131 1726867199.98484: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867199.98496: Set connection var ansible_pipelining to False 13131 1726867199.98499: 
variable 'ansible_shell_executable' from source: unknown 13131 1726867199.98501: variable 'ansible_connection' from source: unknown 13131 1726867199.98583: variable 'ansible_module_compression' from source: unknown 13131 1726867199.98587: variable 'ansible_shell_type' from source: unknown 13131 1726867199.98590: variable 'ansible_shell_executable' from source: unknown 13131 1726867199.98592: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867199.98594: variable 'ansible_pipelining' from source: unknown 13131 1726867199.98596: variable 'ansible_timeout' from source: unknown 13131 1726867199.98598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867199.98601: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867199.98603: variable 'omit' from source: magic vars 13131 1726867199.98605: starting attempt loop 13131 1726867199.98607: running the handler 13131 1726867199.98609: _low_level_execute_command(): starting 13131 1726867199.98612: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867199.99287: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867199.99291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867199.99293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867199.99295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867199.99297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867199.99299: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867199.99415: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867199.99725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867199.99875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867200.01550: stdout chunk (state=3): >>>/root <<< 13131 1726867200.01658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867200.01693: stderr chunk (state=3): >>><<< 13131 1726867200.01703: stdout chunk (state=3): >>><<< 13131 1726867200.01730: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867200.01743: _low_level_execute_command(): starting 13131 1726867200.01749: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867200.0173013-13971-217820798058578 `" && echo ansible-tmp-1726867200.0173013-13971-217820798058578="` echo /root/.ansible/tmp/ansible-tmp-1726867200.0173013-13971-217820798058578 `" ) && sleep 0' 13131 1726867200.03087: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867200.03090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867200.03093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867200.03132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867200.03289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867200.03546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867200.05428: stdout chunk (state=3): >>>ansible-tmp-1726867200.0173013-13971-217820798058578=/root/.ansible/tmp/ansible-tmp-1726867200.0173013-13971-217820798058578 <<< 13131 1726867200.05567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867200.05571: stdout chunk (state=3): >>><<< 13131 1726867200.05586: stderr chunk (state=3): >>><<< 13131 1726867200.05606: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867200.0173013-13971-217820798058578=/root/.ansible/tmp/ansible-tmp-1726867200.0173013-13971-217820798058578 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867200.05653: variable 'ansible_module_compression' from source: unknown 13131 1726867200.05884: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 13131 1726867200.05887: ANSIBALLZ: Acquiring lock 13131 1726867200.05890: ANSIBALLZ: Lock acquired: 140192896239856 13131 1726867200.05892: ANSIBALLZ: Creating module 13131 1726867200.29245: ANSIBALLZ: Writing module into payload 13131 1726867200.29471: ANSIBALLZ: Writing module 13131 1726867200.29491: ANSIBALLZ: Renaming module 13131 1726867200.29497: ANSIBALLZ: Done creating module 13131 1726867200.29517: variable 'ansible_facts' from source: unknown 13131 1726867200.29588: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867200.0173013-13971-217820798058578/AnsiballZ_network_connections.py 13131 1726867200.29690: Sending initial data 13131 1726867200.29696: Sent initial data (168 bytes) 13131 1726867200.30153: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867200.30157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867200.30159: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867200.30161: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867200.30163: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867200.30217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867200.30224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867200.30274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867200.31907: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13131 1726867200.31910: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867200.31946: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867200.31989: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpz3rvp6o0 /root/.ansible/tmp/ansible-tmp-1726867200.0173013-13971-217820798058578/AnsiballZ_network_connections.py <<< 13131 1726867200.31997: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867200.0173013-13971-217820798058578/AnsiballZ_network_connections.py" <<< 13131 1726867200.32033: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpz3rvp6o0" to remote "/root/.ansible/tmp/ansible-tmp-1726867200.0173013-13971-217820798058578/AnsiballZ_network_connections.py" <<< 13131 1726867200.32040: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867200.0173013-13971-217820798058578/AnsiballZ_network_connections.py" <<< 13131 1726867200.32835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867200.32872: stderr chunk (state=3): >>><<< 13131 1726867200.32875: stdout chunk (state=3): >>><<< 13131 1726867200.32916: done transferring module to remote 13131 1726867200.32930: _low_level_execute_command(): starting 13131 1726867200.32933: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867200.0173013-13971-217820798058578/ /root/.ansible/tmp/ansible-tmp-1726867200.0173013-13971-217820798058578/AnsiballZ_network_connections.py && sleep 0' 13131 1726867200.33432: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867200.33436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867200.33438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867200.33440: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867200.33443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867200.33531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867200.33537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867200.33600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867200.35336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867200.35355: stderr chunk (state=3): >>><<< 13131 1726867200.35359: stdout chunk (state=3): >>><<< 13131 1726867200.35383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867200.35389: _low_level_execute_command(): starting 13131 1726867200.35392: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867200.0173013-13971-217820798058578/AnsiballZ_network_connections.py && sleep 0' 13131 1726867200.35983: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867200.35986: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867200.35989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867200.36025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867200.36035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867200.36088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867200.77252: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, df9218e5-fcde-46a8-b91e-9607fcfd47af\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, df9218e5-fcde-46a8-b91e-9607fcfd47af (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", 
"interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13131 1726867200.79180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867200.79225: stdout chunk (state=3): >>><<< 13131 1726867200.79228: stderr chunk (state=3): >>><<< 13131 1726867200.79333: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, df9218e5-fcde-46a8-b91e-9607fcfd47af\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, df9218e5-fcde-46a8-b91e-9607fcfd47af (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, 
"invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867200.79704: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867200.0173013-13971-217820798058578/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867200.79713: _low_level_execute_command(): starting 13131 1726867200.79715: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867200.0173013-13971-217820798058578/ > /dev/null 2>&1 && sleep 0' 13131 1726867200.80888: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867200.80892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867200.80969: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867200.81029: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867200.81058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867200.81214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867200.81359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867200.83179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867200.83426: stderr chunk (state=3): >>><<< 13131 1726867200.83430: stdout chunk (state=3): >>><<< 13131 1726867200.83433: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867200.83435: handler run complete 13131 1726867200.83437: attempt loop complete, returning result 13131 1726867200.83439: _execute() done 13131 1726867200.83441: dumping result to json 13131 1726867200.83443: done dumping result, returning 13131 1726867200.83445: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-5f24-9b7a-000000000036] 13131 1726867200.83447: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000036 changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, df9218e5-fcde-46a8-b91e-9607fcfd47af [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 
ca2e10a6-bdb2-4703-8f7f-0fc9be649723 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, df9218e5-fcde-46a8-b91e-9607fcfd47af (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723 (not-active) 13131 1726867200.83776: no more pending results, returning what we have 13131 1726867200.83782: results queue empty 13131 1726867200.83783: checking for any_errors_fatal 13131 1726867200.83788: done checking for any_errors_fatal 13131 1726867200.83789: checking for max_fail_percentage 13131 1726867200.83791: done checking for max_fail_percentage 13131 1726867200.83792: checking to see if all hosts have failed and the running result is not ok 13131 1726867200.83793: done checking to see if all hosts have failed 13131 1726867200.83793: getting the remaining hosts for this loop 13131 1726867200.83795: done getting the remaining hosts for this loop 13131 1726867200.83798: getting the next task for host managed_node1 13131 1726867200.83806: done getting next task for host managed_node1 13131 1726867200.83809: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13131 1726867200.83812: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867200.83822: getting variables 13131 1726867200.83824: in VariableManager get_vars() 13131 1726867200.83876: Calling all_inventory to load vars for managed_node1 13131 1726867200.84357: Calling groups_inventory to load vars for managed_node1 13131 1726867200.84360: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867200.84371: Calling all_plugins_play to load vars for managed_node1 13131 1726867200.84374: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867200.84379: Calling groups_plugins_play to load vars for managed_node1 13131 1726867200.85091: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000036 13131 1726867200.85094: WORKER PROCESS EXITING 13131 1726867200.86083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867200.88573: done with get_vars() 13131 1726867200.88599: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:20:00 -0400 (0:00:01.071) 0:00:15.997 ****** 13131 1726867200.88694: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 13131 1726867200.88696: Creating lock for fedora.linux_system_roles.network_state 13131 1726867200.89258: worker is 1 (out of 1 available) 13131 1726867200.89272: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 13131 1726867200.89578: done queuing things up, now waiting for results queue to drain 13131 1726867200.89580: waiting for pending results... 
13131 1726867200.89807: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 13131 1726867200.89938: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000037 13131 1726867200.89959: variable 'ansible_search_path' from source: unknown 13131 1726867200.89966: variable 'ansible_search_path' from source: unknown 13131 1726867200.90009: calling self._execute() 13131 1726867200.90103: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867200.90143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867200.90147: variable 'omit' from source: magic vars 13131 1726867200.90511: variable 'ansible_distribution_major_version' from source: facts 13131 1726867200.90528: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867200.90683: variable 'network_state' from source: role '' defaults 13131 1726867200.90687: Evaluated conditional (network_state != {}): False 13131 1726867200.90689: when evaluation is False, skipping this task 13131 1726867200.90691: _execute() done 13131 1726867200.90693: dumping result to json 13131 1726867200.90695: done dumping result, returning 13131 1726867200.90697: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-5f24-9b7a-000000000037] 13131 1726867200.90699: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000037 13131 1726867200.90985: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000037 13131 1726867200.90988: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867200.91027: no more pending results, returning what we have 13131 1726867200.91031: results queue empty 13131 1726867200.91031: checking for any_errors_fatal 13131 1726867200.91041: done checking for any_errors_fatal 
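The "Configure networking state" task above was skipped because the role default for `network_state` is an empty dict, so its conditional `network_state != {}` evaluated to False. It would only run if the play supplied a non-empty `network_state`, for example (illustrative values only, not taken from this run; the shape follows the declarative interface-state style the role passes through):

```yaml
# Hypothetical example: any non-empty network_state makes the
# conditional above True and routes configuration through the
# state-based provider instead of skipping the task.
network_state:
  interfaces:
    - name: eth0
      type: ethernet
      state: up
```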
13131 1726867200.91042: checking for max_fail_percentage 13131 1726867200.91043: done checking for max_fail_percentage 13131 1726867200.91044: checking to see if all hosts have failed and the running result is not ok 13131 1726867200.91045: done checking to see if all hosts have failed 13131 1726867200.91045: getting the remaining hosts for this loop 13131 1726867200.91047: done getting the remaining hosts for this loop 13131 1726867200.91050: getting the next task for host managed_node1 13131 1726867200.91055: done getting next task for host managed_node1 13131 1726867200.91058: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13131 1726867200.91061: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867200.91075: getting variables 13131 1726867200.91076: in VariableManager get_vars() 13131 1726867200.91120: Calling all_inventory to load vars for managed_node1 13131 1726867200.91123: Calling groups_inventory to load vars for managed_node1 13131 1726867200.91126: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867200.91135: Calling all_plugins_play to load vars for managed_node1 13131 1726867200.91137: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867200.91140: Calling groups_plugins_play to load vars for managed_node1 13131 1726867200.92661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867200.95321: done with get_vars() 13131 1726867200.95344: done getting variables 13131 1726867200.95405: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:20:00 -0400 (0:00:00.067) 0:00:16.064 ****** 13131 1726867200.95438: entering _queue_task() for managed_node1/debug 13131 1726867200.95904: worker is 1 (out of 1 available) 13131 1726867200.95914: exiting _queue_task() for managed_node1/debug 13131 1726867200.95923: done queuing things up, now waiting for results queue to drain 13131 1726867200.95925: waiting for pending results... 
13131 1726867200.96264: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13131 1726867200.96784: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000038 13131 1726867200.96788: variable 'ansible_search_path' from source: unknown 13131 1726867200.96791: variable 'ansible_search_path' from source: unknown 13131 1726867200.96794: calling self._execute() 13131 1726867200.96902: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867200.96915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867200.96927: variable 'omit' from source: magic vars 13131 1726867200.97613: variable 'ansible_distribution_major_version' from source: facts 13131 1726867200.97666: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867200.97869: variable 'omit' from source: magic vars 13131 1726867200.97872: variable 'omit' from source: magic vars 13131 1726867200.97875: variable 'omit' from source: magic vars 13131 1726867200.98016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867200.98057: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867200.98105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867200.98205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867200.98222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867200.98257: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867200.98265: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867200.98307: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 13131 1726867200.98402: Set connection var ansible_connection to ssh 13131 1726867200.98467: Set connection var ansible_timeout to 10 13131 1726867200.98475: Set connection var ansible_shell_type to sh 13131 1726867200.98492: Set connection var ansible_shell_executable to /bin/sh 13131 1726867200.98506: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867200.98519: Set connection var ansible_pipelining to False 13131 1726867200.98544: variable 'ansible_shell_executable' from source: unknown 13131 1726867200.98626: variable 'ansible_connection' from source: unknown 13131 1726867200.98630: variable 'ansible_module_compression' from source: unknown 13131 1726867200.98632: variable 'ansible_shell_type' from source: unknown 13131 1726867200.98634: variable 'ansible_shell_executable' from source: unknown 13131 1726867200.98636: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867200.98637: variable 'ansible_pipelining' from source: unknown 13131 1726867200.98639: variable 'ansible_timeout' from source: unknown 13131 1726867200.98641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867200.98720: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867200.98739: variable 'omit' from source: magic vars 13131 1726867200.98748: starting attempt loop 13131 1726867200.98755: running the handler 13131 1726867200.98883: variable '__network_connections_result' from source: set_fact 13131 1726867200.98949: handler run complete 13131 1726867200.98974: attempt loop complete, returning result 13131 1726867200.98984: _execute() done 13131 1726867200.98992: dumping result to json 13131 1726867200.98999: 
done dumping result, returning 13131 1726867200.99010: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-5f24-9b7a-000000000038] 13131 1726867200.99019: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000038 13131 1726867200.99131: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000038 13131 1726867200.99134: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, df9218e5-fcde-46a8-b91e-9607fcfd47af", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, df9218e5-fcde-46a8-b91e-9607fcfd47af (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723 (not-active)" ] } 13131 1726867200.99233: no more pending results, returning what we have 13131 1726867200.99237: results queue empty 13131 1726867200.99238: checking for any_errors_fatal 13131 1726867200.99245: done checking for any_errors_fatal 13131 1726867200.99245: checking for max_fail_percentage 13131 1726867200.99247: done checking for max_fail_percentage 13131 1726867200.99248: checking to see if all hosts have failed and the running result is not ok 13131 1726867200.99249: done checking to see if all hosts have failed 13131 1726867200.99249: getting the remaining hosts for this loop 13131 1726867200.99251: done getting the remaining hosts for this loop 13131 1726867200.99254: getting the next task for host 
managed_node1 13131 1726867200.99260: done getting next task for host managed_node1 13131 1726867200.99264: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13131 1726867200.99268: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867200.99281: getting variables 13131 1726867200.99282: in VariableManager get_vars() 13131 1726867200.99333: Calling all_inventory to load vars for managed_node1 13131 1726867200.99336: Calling groups_inventory to load vars for managed_node1 13131 1726867200.99338: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867200.99348: Calling all_plugins_play to load vars for managed_node1 13131 1726867200.99350: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867200.99353: Calling groups_plugins_play to load vars for managed_node1 13131 1726867201.02523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867201.04099: done with get_vars() 13131 1726867201.04120: done getting variables 13131 1726867201.04174: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show 
debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:20:01 -0400 (0:00:00.087) 0:00:16.152 ****** 13131 1726867201.04212: entering _queue_task() for managed_node1/debug 13131 1726867201.04470: worker is 1 (out of 1 available) 13131 1726867201.04585: exiting _queue_task() for managed_node1/debug 13131 1726867201.04596: done queuing things up, now waiting for results queue to drain 13131 1726867201.04598: waiting for pending results... 13131 1726867201.04833: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13131 1726867201.04931: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000039 13131 1726867201.04935: variable 'ansible_search_path' from source: unknown 13131 1726867201.04937: variable 'ansible_search_path' from source: unknown 13131 1726867201.04975: calling self._execute() 13131 1726867201.05084: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867201.05088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867201.05090: variable 'omit' from source: magic vars 13131 1726867201.05449: variable 'ansible_distribution_major_version' from source: facts 13131 1726867201.05476: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867201.05682: variable 'omit' from source: magic vars 13131 1726867201.05686: variable 'omit' from source: magic vars 13131 1726867201.05688: variable 'omit' from source: magic vars 13131 1726867201.05691: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867201.05693: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867201.05695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
13131 1726867201.05698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867201.05711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867201.05743: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867201.05751: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867201.05758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867201.05859: Set connection var ansible_connection to ssh 13131 1726867201.05872: Set connection var ansible_timeout to 10 13131 1726867201.05880: Set connection var ansible_shell_type to sh 13131 1726867201.05892: Set connection var ansible_shell_executable to /bin/sh 13131 1726867201.05906: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867201.05916: Set connection var ansible_pipelining to False 13131 1726867201.05941: variable 'ansible_shell_executable' from source: unknown 13131 1726867201.05948: variable 'ansible_connection' from source: unknown 13131 1726867201.05955: variable 'ansible_module_compression' from source: unknown 13131 1726867201.05961: variable 'ansible_shell_type' from source: unknown 13131 1726867201.05967: variable 'ansible_shell_executable' from source: unknown 13131 1726867201.06031: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867201.06033: variable 'ansible_pipelining' from source: unknown 13131 1726867201.06036: variable 'ansible_timeout' from source: unknown 13131 1726867201.06039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867201.06127: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867201.06146: variable 'omit' from source: magic vars 13131 1726867201.06155: starting attempt loop 13131 1726867201.06161: running the handler 13131 1726867201.06214: variable '__network_connections_result' from source: set_fact 13131 1726867201.06296: variable '__network_connections_result' from source: set_fact 13131 1726867201.06462: handler run complete 13131 1726867201.06502: attempt loop complete, returning result 13131 1726867201.06580: _execute() done 13131 1726867201.06583: dumping result to json 13131 1726867201.06585: done dumping result, returning 13131 1726867201.06588: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-5f24-9b7a-000000000039] 13131 1726867201.06590: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000039 13131 1726867201.06659: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000039 13131 1726867201.06662: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 
df9218e5-fcde-46a8-b91e-9607fcfd47af\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, df9218e5-fcde-46a8-b91e-9607fcfd47af (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, df9218e5-fcde-46a8-b91e-9607fcfd47af", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, df9218e5-fcde-46a8-b91e-9607fcfd47af (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723 (not-active)" ] } } 13131 1726867201.06779: no more pending results, returning what we have 13131 1726867201.06783: results queue empty 13131 1726867201.06789: checking for any_errors_fatal 13131 1726867201.06795: done checking for any_errors_fatal 13131 1726867201.06796: checking for max_fail_percentage 13131 1726867201.06797: done checking for max_fail_percentage 13131 1726867201.06798: checking to see if all hosts have failed and the running result is not ok 13131 1726867201.06798: done checking to see if all hosts have failed 13131 1726867201.06799: getting the remaining 
hosts for this loop 13131 1726867201.06800: done getting the remaining hosts for this loop 13131 1726867201.06804: getting the next task for host managed_node1 13131 1726867201.06810: done getting next task for host managed_node1 13131 1726867201.06814: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13131 1726867201.06817: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867201.06828: getting variables 13131 1726867201.06830: in VariableManager get_vars() 13131 1726867201.06990: Calling all_inventory to load vars for managed_node1 13131 1726867201.06994: Calling groups_inventory to load vars for managed_node1 13131 1726867201.06997: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867201.07006: Calling all_plugins_play to load vars for managed_node1 13131 1726867201.07009: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867201.07012: Calling groups_plugins_play to load vars for managed_node1 13131 1726867201.08360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867201.11411: done with get_vars() 13131 1726867201.11432: done getting variables 13131 1726867201.11636: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:20:01 -0400 (0:00:00.074) 0:00:16.227 ****** 13131 1726867201.11668: entering _queue_task() for managed_node1/debug 13131 1726867201.11958: worker is 1 (out of 1 available) 13131 1726867201.11970: exiting _queue_task() for managed_node1/debug 13131 1726867201.12184: done queuing things up, now waiting for results queue to drain 13131 1726867201.12186: waiting for pending results... 13131 1726867201.12394: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13131 1726867201.12432: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000003a 13131 1726867201.12458: variable 'ansible_search_path' from source: unknown 13131 1726867201.12466: variable 'ansible_search_path' from source: unknown 13131 1726867201.12508: calling self._execute() 13131 1726867201.12600: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867201.12613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867201.12631: variable 'omit' from source: magic vars 13131 1726867201.13154: variable 'ansible_distribution_major_version' from source: facts 13131 1726867201.13171: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867201.13294: variable 'network_state' from source: role '' defaults 13131 1726867201.13325: Evaluated conditional (network_state != {}): False 13131 1726867201.13328: when evaluation is False, skipping this task 13131 1726867201.13330: _execute() done 13131 1726867201.13332: dumping result to json 13131 1726867201.13333: done 
dumping result, returning 13131 1726867201.13337: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-5f24-9b7a-00000000003a] 13131 1726867201.13435: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000003a 13131 1726867201.13495: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000003a 13131 1726867201.13498: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 13131 1726867201.13592: no more pending results, returning what we have 13131 1726867201.13596: results queue empty 13131 1726867201.13597: checking for any_errors_fatal 13131 1726867201.13606: done checking for any_errors_fatal 13131 1726867201.13607: checking for max_fail_percentage 13131 1726867201.13609: done checking for max_fail_percentage 13131 1726867201.13610: checking to see if all hosts have failed and the running result is not ok 13131 1726867201.13611: done checking to see if all hosts have failed 13131 1726867201.13612: getting the remaining hosts for this loop 13131 1726867201.13613: done getting the remaining hosts for this loop 13131 1726867201.13617: getting the next task for host managed_node1 13131 1726867201.13625: done getting next task for host managed_node1 13131 1726867201.13629: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13131 1726867201.13633: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 13131 1726867201.13650: getting variables 13131 1726867201.13652: in VariableManager get_vars() 13131 1726867201.13710: Calling all_inventory to load vars for managed_node1 13131 1726867201.13714: Calling groups_inventory to load vars for managed_node1 13131 1726867201.13717: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867201.13729: Calling all_plugins_play to load vars for managed_node1 13131 1726867201.13732: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867201.13735: Calling groups_plugins_play to load vars for managed_node1 13131 1726867201.15253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867201.17259: done with get_vars() 13131 1726867201.17283: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:20:01 -0400 (0:00:00.056) 0:00:16.284 ****** 13131 1726867201.17355: entering _queue_task() for managed_node1/ping 13131 1726867201.17356: Creating lock for ping 13131 1726867201.17590: worker is 1 (out of 1 available) 13131 1726867201.17606: exiting _queue_task() for managed_node1/ping 13131 1726867201.17616: done queuing things up, now waiting for results queue to drain 13131 1726867201.17617: waiting for pending results... 
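The `module_args` printed in the "Show debug messages for the network_connections" result above imply role input along the following lines. This is a minimal sketch reconstructed from the logged connections list; the play wrapping (`hosts`, the `roles:` form) is an assumption for illustration and does not appear in this log.

```yaml
# Hedged reconstruction from the logged module_args; only the
# network_connections entries are taken from this log, the play
# structure around them is assumed.
- hosts: managed_node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: bond0
            type: bond
            interface_name: nm-bond
            state: up
            bond:
              mode: active-backup
              miimon: 110
            ip:
              route_metric4: 65535
          - name: bond0.0
            type: ethernet
            interface_name: test1
            controller: bond0
            state: up
          - name: bond0.1
            type: ethernet
            interface_name: test2
            controller: bond0
            state: up
```

With `provider: nm` (as logged), the role applies these profiles through NetworkManager, which matches the `add connection` / `up connection` stderr lines shown in the task result.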
13131 1726867201.17791: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 13131 1726867201.17874: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000003b 13131 1726867201.17889: variable 'ansible_search_path' from source: unknown 13131 1726867201.17895: variable 'ansible_search_path' from source: unknown 13131 1726867201.17921: calling self._execute() 13131 1726867201.17989: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867201.17996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867201.18003: variable 'omit' from source: magic vars 13131 1726867201.18282: variable 'ansible_distribution_major_version' from source: facts 13131 1726867201.18287: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867201.18297: variable 'omit' from source: magic vars 13131 1726867201.18336: variable 'omit' from source: magic vars 13131 1726867201.18360: variable 'omit' from source: magic vars 13131 1726867201.18395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867201.18421: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867201.18435: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867201.18449: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867201.18459: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867201.18508: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867201.18511: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867201.18514: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 13131 1726867201.18649: Set connection var ansible_connection to ssh 13131 1726867201.18652: Set connection var ansible_timeout to 10 13131 1726867201.18655: Set connection var ansible_shell_type to sh 13131 1726867201.18658: Set connection var ansible_shell_executable to /bin/sh 13131 1726867201.18660: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867201.18662: Set connection var ansible_pipelining to False 13131 1726867201.18664: variable 'ansible_shell_executable' from source: unknown 13131 1726867201.18666: variable 'ansible_connection' from source: unknown 13131 1726867201.18668: variable 'ansible_module_compression' from source: unknown 13131 1726867201.18670: variable 'ansible_shell_type' from source: unknown 13131 1726867201.18672: variable 'ansible_shell_executable' from source: unknown 13131 1726867201.18674: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867201.18676: variable 'ansible_pipelining' from source: unknown 13131 1726867201.18680: variable 'ansible_timeout' from source: unknown 13131 1726867201.18682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867201.18857: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867201.18882: variable 'omit' from source: magic vars 13131 1726867201.18886: starting attempt loop 13131 1726867201.18888: running the handler 13131 1726867201.18906: _low_level_execute_command(): starting 13131 1726867201.19043: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867201.19650: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 
1726867201.19666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867201.19680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867201.19733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867201.19742: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867201.19815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867201.21504: stdout chunk (state=3): >>>/root <<< 13131 1726867201.21648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867201.21672: stderr chunk (state=3): >>><<< 13131 1726867201.21675: stdout chunk (state=3): >>><<< 13131 1726867201.21782: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867201.21786: _low_level_execute_command(): starting 13131 1726867201.21790: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867201.2169821-14033-57987459029552 `" && echo ansible-tmp-1726867201.2169821-14033-57987459029552="` echo /root/.ansible/tmp/ansible-tmp-1726867201.2169821-14033-57987459029552 `" ) && sleep 0' 13131 1726867201.22449: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867201.22453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867201.22491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867201.22506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867201.22530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867201.22622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867201.24505: stdout chunk (state=3): >>>ansible-tmp-1726867201.2169821-14033-57987459029552=/root/.ansible/tmp/ansible-tmp-1726867201.2169821-14033-57987459029552 <<< 13131 1726867201.24667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867201.24669: stdout chunk (state=3): >>><<< 13131 1726867201.24671: stderr chunk (state=3): >>><<< 13131 1726867201.24782: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867201.2169821-14033-57987459029552=/root/.ansible/tmp/ansible-tmp-1726867201.2169821-14033-57987459029552 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867201.24785: variable 'ansible_module_compression' from source: unknown 13131 1726867201.24787: ANSIBALLZ: Using lock for ping 13131 1726867201.24789: ANSIBALLZ: Acquiring lock 13131 1726867201.24791: ANSIBALLZ: Lock acquired: 140192896452848 13131 1726867201.24802: ANSIBALLZ: Creating module 13131 1726867201.33902: ANSIBALLZ: Writing module into payload 13131 1726867201.33944: ANSIBALLZ: Writing module 13131 1726867201.33966: ANSIBALLZ: Renaming module 13131 1726867201.33976: ANSIBALLZ: Done creating module 13131 1726867201.33998: variable 'ansible_facts' from source: unknown 13131 1726867201.34042: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867201.2169821-14033-57987459029552/AnsiballZ_ping.py 13131 1726867201.34142: Sending initial data 13131 1726867201.34152: Sent initial data (152 bytes) 13131 1726867201.34620: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867201.34655: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867201.34658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867201.34707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867201.36342: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13131 1726867201.36345: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867201.36385: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867201.36430: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpd83s4acn /root/.ansible/tmp/ansible-tmp-1726867201.2169821-14033-57987459029552/AnsiballZ_ping.py <<< 13131 1726867201.36433: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867201.2169821-14033-57987459029552/AnsiballZ_ping.py" <<< 13131 1726867201.36490: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpd83s4acn" to remote "/root/.ansible/tmp/ansible-tmp-1726867201.2169821-14033-57987459029552/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867201.2169821-14033-57987459029552/AnsiballZ_ping.py" <<< 13131 1726867201.37009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867201.37050: stderr chunk (state=3): >>><<< 13131 1726867201.37053: stdout chunk (state=3): >>><<< 13131 1726867201.37073: done transferring module to remote 13131 1726867201.37083: _low_level_execute_command(): starting 13131 1726867201.37088: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867201.2169821-14033-57987459029552/ /root/.ansible/tmp/ansible-tmp-1726867201.2169821-14033-57987459029552/AnsiballZ_ping.py && sleep 0' 13131 1726867201.37524: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867201.37529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867201.37532: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867201.37534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867201.37536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867201.37586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867201.37593: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867201.37634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867201.39412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867201.39436: stderr chunk (state=3): >>><<< 13131 1726867201.39439: stdout chunk (state=3): >>><<< 13131 1726867201.39453: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867201.39456: _low_level_execute_command(): starting 13131 1726867201.39460: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867201.2169821-14033-57987459029552/AnsiballZ_ping.py && sleep 0' 13131 1726867201.39880: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867201.39884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867201.39918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867201.39921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867201.39924: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 13131 1726867201.39926: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867201.39928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867201.39984: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867201.39990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867201.39992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867201.40048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867201.55218: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13131 1726867201.56538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867201.56612: stderr chunk (state=3): >>><<< 13131 1726867201.56616: stdout chunk (state=3): >>><<< 13131 1726867201.56618: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867201.56622: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867201.2169821-14033-57987459029552/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867201.56624: _low_level_execute_command(): starting 13131 1726867201.56630: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867201.2169821-14033-57987459029552/ > /dev/null 2>&1 && sleep 0' 13131 1726867201.57232: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867201.57243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867201.57266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867201.57269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867201.57372: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867201.57375: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867201.57378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867201.57381: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass <<< 13131 1726867201.57382: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867201.57384: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867201.57386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867201.57388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867201.57390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867201.57394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867201.57396: stderr chunk (state=3): >>>debug2: match found <<< 13131 1726867201.57398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867201.57439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867201.57446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867201.57467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867201.57545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867201.59683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867201.59687: stdout chunk (state=3): >>><<< 13131 1726867201.59689: stderr chunk (state=3): >>><<< 13131 1726867201.59886: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867201.59890: handler run complete 13131 1726867201.59893: attempt loop complete, returning result 13131 1726867201.59895: _execute() done 13131 1726867201.59897: dumping result to json 13131 1726867201.59899: done dumping result, returning 13131 1726867201.59901: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-5f24-9b7a-00000000003b] 13131 1726867201.59903: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000003b 13131 1726867201.59974: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000003b 13131 1726867201.59980: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 13131 1726867201.60243: no more pending results, returning what we have 13131 1726867201.60247: results queue empty 13131 1726867201.60248: checking for any_errors_fatal 13131 1726867201.60255: done checking for any_errors_fatal 13131 1726867201.60256: checking for max_fail_percentage 13131 1726867201.60258: done checking for max_fail_percentage 13131 1726867201.60259: checking to see if all hosts have failed and the running result is not ok 13131 1726867201.60259: done 
checking to see if all hosts have failed 13131 1726867201.60260: getting the remaining hosts for this loop 13131 1726867201.60262: done getting the remaining hosts for this loop 13131 1726867201.60266: getting the next task for host managed_node1 13131 1726867201.60279: done getting next task for host managed_node1 13131 1726867201.60282: ^ task is: TASK: meta (role_complete) 13131 1726867201.60285: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867201.60297: getting variables 13131 1726867201.60299: in VariableManager get_vars() 13131 1726867201.60357: Calling all_inventory to load vars for managed_node1 13131 1726867201.60361: Calling groups_inventory to load vars for managed_node1 13131 1726867201.60363: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867201.60375: Calling all_plugins_play to load vars for managed_node1 13131 1726867201.60784: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867201.60789: Calling groups_plugins_play to load vars for managed_node1 13131 1726867201.63184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867201.65729: done with get_vars() 13131 1726867201.65752: done getting variables 13131 1726867201.65837: done queuing things up, now waiting for results queue to drain 13131 1726867201.65839: results queue empty 13131 1726867201.65840: checking for any_errors_fatal 13131 
1726867201.65842: done checking for any_errors_fatal 13131 1726867201.65843: checking for max_fail_percentage 13131 1726867201.65844: done checking for max_fail_percentage 13131 1726867201.65845: checking to see if all hosts have failed and the running result is not ok 13131 1726867201.65845: done checking to see if all hosts have failed 13131 1726867201.65846: getting the remaining hosts for this loop 13131 1726867201.65847: done getting the remaining hosts for this loop 13131 1726867201.65850: getting the next task for host managed_node1 13131 1726867201.65855: done getting next task for host managed_node1 13131 1726867201.65857: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13131 1726867201.65859: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867201.65861: getting variables 13131 1726867201.65862: in VariableManager get_vars() 13131 1726867201.65885: Calling all_inventory to load vars for managed_node1 13131 1726867201.65887: Calling groups_inventory to load vars for managed_node1 13131 1726867201.65889: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867201.65894: Calling all_plugins_play to load vars for managed_node1 13131 1726867201.65896: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867201.65899: Calling groups_plugins_play to load vars for managed_node1 13131 1726867201.67015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867201.68522: done with get_vars() 13131 1726867201.68548: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 17:20:01 -0400 (0:00:00.512) 0:00:16.796 ****** 13131 1726867201.68629: entering _queue_task() for managed_node1/include_tasks 13131 1726867201.68994: worker is 1 (out of 1 available) 13131 1726867201.69007: exiting _queue_task() for managed_node1/include_tasks 13131 1726867201.69021: done queuing things up, now waiting for results queue to drain 13131 1726867201.69022: waiting for pending results... 
13131 1726867201.69394: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 13131 1726867201.69399: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000006e 13131 1726867201.69484: variable 'ansible_search_path' from source: unknown 13131 1726867201.69488: variable 'ansible_search_path' from source: unknown 13131 1726867201.69494: calling self._execute() 13131 1726867201.69573: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867201.69588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867201.69606: variable 'omit' from source: magic vars 13131 1726867201.70018: variable 'ansible_distribution_major_version' from source: facts 13131 1726867201.70035: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867201.70046: _execute() done 13131 1726867201.70060: dumping result to json 13131 1726867201.70173: done dumping result, returning 13131 1726867201.70179: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-5f24-9b7a-00000000006e] 13131 1726867201.70181: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000006e 13131 1726867201.70252: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000006e 13131 1726867201.70255: WORKER PROCESS EXITING 13131 1726867201.70288: no more pending results, returning what we have 13131 1726867201.70295: in VariableManager get_vars() 13131 1726867201.70355: Calling all_inventory to load vars for managed_node1 13131 1726867201.70358: Calling groups_inventory to load vars for managed_node1 13131 1726867201.70361: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867201.70374: Calling all_plugins_play to load vars for managed_node1 13131 1726867201.70379: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867201.70383: Calling groups_plugins_play to load vars for managed_node1 13131 
1726867201.72225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867201.73879: done with get_vars() 13131 1726867201.73909: variable 'ansible_search_path' from source: unknown 13131 1726867201.73911: variable 'ansible_search_path' from source: unknown 13131 1726867201.73954: we have included files to process 13131 1726867201.73956: generating all_blocks data 13131 1726867201.73962: done generating all_blocks data 13131 1726867201.73967: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13131 1726867201.73968: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13131 1726867201.73971: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13131 1726867201.74171: done processing included file 13131 1726867201.74178: iterating over new_blocks loaded from include file 13131 1726867201.74181: in VariableManager get_vars() 13131 1726867201.74210: done with get_vars() 13131 1726867201.74212: filtering new block on tags 13131 1726867201.74231: done filtering new block on tags 13131 1726867201.74233: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 13131 1726867201.74238: extending task lists for all hosts with included blocks 13131 1726867201.74343: done extending task lists 13131 1726867201.74344: done processing included files 13131 1726867201.74345: results queue empty 13131 1726867201.74345: checking for any_errors_fatal 13131 1726867201.74347: done checking for any_errors_fatal 13131 1726867201.74348: checking for max_fail_percentage 13131 1726867201.74349: done checking for 
max_fail_percentage 13131 1726867201.74349: checking to see if all hosts have failed and the running result is not ok 13131 1726867201.74350: done checking to see if all hosts have failed 13131 1726867201.74351: getting the remaining hosts for this loop 13131 1726867201.74352: done getting the remaining hosts for this loop 13131 1726867201.74354: getting the next task for host managed_node1 13131 1726867201.74358: done getting next task for host managed_node1 13131 1726867201.74360: ^ task is: TASK: Get stat for interface {{ interface }} 13131 1726867201.74363: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867201.74366: getting variables 13131 1726867201.74367: in VariableManager get_vars() 13131 1726867201.74391: Calling all_inventory to load vars for managed_node1 13131 1726867201.74394: Calling groups_inventory to load vars for managed_node1 13131 1726867201.74396: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867201.74401: Calling all_plugins_play to load vars for managed_node1 13131 1726867201.74404: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867201.74406: Calling groups_plugins_play to load vars for managed_node1 13131 1726867201.75601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867201.77174: done with get_vars() 13131 1726867201.77208: done getting variables 13131 1726867201.77374: variable 'interface' from source: task vars 13131 1726867201.77380: variable 'controller_device' from source: play vars 13131 1726867201.77447: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:20:01 -0400 (0:00:00.088) 0:00:16.885 ****** 13131 1726867201.77481: entering _queue_task() for managed_node1/stat 13131 1726867201.77854: worker is 1 (out of 1 available) 13131 1726867201.77874: exiting _queue_task() for managed_node1/stat 13131 1726867201.77892: done queuing things up, now waiting for results queue to drain 13131 1726867201.77894: waiting for pending results... 
13131 1726867201.78340: running TaskExecutor() for managed_node1/TASK: Get stat for interface nm-bond 13131 1726867201.78439: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000337 13131 1726867201.78443: variable 'ansible_search_path' from source: unknown 13131 1726867201.78445: variable 'ansible_search_path' from source: unknown 13131 1726867201.78453: calling self._execute() 13131 1726867201.78548: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867201.78554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867201.78649: variable 'omit' from source: magic vars 13131 1726867201.78930: variable 'ansible_distribution_major_version' from source: facts 13131 1726867201.78949: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867201.78953: variable 'omit' from source: magic vars 13131 1726867201.79015: variable 'omit' from source: magic vars 13131 1726867201.79120: variable 'interface' from source: task vars 13131 1726867201.79124: variable 'controller_device' from source: play vars 13131 1726867201.79202: variable 'controller_device' from source: play vars 13131 1726867201.79222: variable 'omit' from source: magic vars 13131 1726867201.79256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867201.79286: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867201.79302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867201.79318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867201.79328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867201.79351: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 13131 1726867201.79354: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867201.79356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867201.79434: Set connection var ansible_connection to ssh 13131 1726867201.79440: Set connection var ansible_timeout to 10 13131 1726867201.79443: Set connection var ansible_shell_type to sh 13131 1726867201.79450: Set connection var ansible_shell_executable to /bin/sh 13131 1726867201.79457: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867201.79462: Set connection var ansible_pipelining to False 13131 1726867201.79480: variable 'ansible_shell_executable' from source: unknown 13131 1726867201.79483: variable 'ansible_connection' from source: unknown 13131 1726867201.79486: variable 'ansible_module_compression' from source: unknown 13131 1726867201.79488: variable 'ansible_shell_type' from source: unknown 13131 1726867201.79490: variable 'ansible_shell_executable' from source: unknown 13131 1726867201.79502: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867201.79504: variable 'ansible_pipelining' from source: unknown 13131 1726867201.79507: variable 'ansible_timeout' from source: unknown 13131 1726867201.79509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867201.79654: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867201.79661: variable 'omit' from source: magic vars 13131 1726867201.79666: starting attempt loop 13131 1726867201.79669: running the handler 13131 1726867201.79682: _low_level_execute_command(): starting 13131 1726867201.79690: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 
1726867201.80212: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867201.80217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867201.80221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867201.80224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867201.80257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867201.80263: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867201.80271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867201.80344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867201.82047: stdout chunk (state=3): >>>/root <<< 13131 1726867201.82149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867201.82170: stderr chunk (state=3): >>><<< 13131 1726867201.82173: stdout chunk (state=3): >>><<< 13131 1726867201.82198: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867201.82207: _low_level_execute_command(): starting 13131 1726867201.82213: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867201.8219504-14063-101072624107754 `" && echo ansible-tmp-1726867201.8219504-14063-101072624107754="` echo /root/.ansible/tmp/ansible-tmp-1726867201.8219504-14063-101072624107754 `" ) && sleep 0' 13131 1726867201.82644: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867201.82648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867201.82658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867201.82660: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867201.82663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867201.82713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867201.82719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867201.82761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867201.84645: stdout chunk (state=3): >>>ansible-tmp-1726867201.8219504-14063-101072624107754=/root/.ansible/tmp/ansible-tmp-1726867201.8219504-14063-101072624107754 <<< 13131 1726867201.84747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867201.84774: stderr chunk (state=3): >>><<< 13131 1726867201.84788: stdout chunk (state=3): >>><<< 13131 1726867201.84800: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867201.8219504-14063-101072624107754=/root/.ansible/tmp/ansible-tmp-1726867201.8219504-14063-101072624107754 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867201.84838: variable 'ansible_module_compression' from source: unknown 13131 1726867201.84882: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13131 1726867201.84917: variable 'ansible_facts' from source: unknown 13131 1726867201.84975: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867201.8219504-14063-101072624107754/AnsiballZ_stat.py 13131 1726867201.85079: Sending initial data 13131 1726867201.85083: Sent initial data (153 bytes) 13131 1726867201.85518: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867201.85521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867201.85523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867201.85526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867201.85528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867201.85575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867201.85588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867201.85627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867201.87226: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867201.87276: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867201.87327: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmplibnxls9 /root/.ansible/tmp/ansible-tmp-1726867201.8219504-14063-101072624107754/AnsiballZ_stat.py <<< 13131 1726867201.87344: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867201.8219504-14063-101072624107754/AnsiballZ_stat.py" <<< 13131 1726867201.87389: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmplibnxls9" to remote "/root/.ansible/tmp/ansible-tmp-1726867201.8219504-14063-101072624107754/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867201.8219504-14063-101072624107754/AnsiballZ_stat.py" <<< 13131 1726867201.88103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867201.88141: stderr chunk (state=3): >>><<< 13131 1726867201.88144: stdout chunk (state=3): >>><<< 13131 1726867201.88169: done transferring module to remote 13131 1726867201.88179: _low_level_execute_command(): starting 13131 1726867201.88184: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867201.8219504-14063-101072624107754/ /root/.ansible/tmp/ansible-tmp-1726867201.8219504-14063-101072624107754/AnsiballZ_stat.py && sleep 0' 13131 1726867201.88615: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867201.88618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867201.88621: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867201.88623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867201.88664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867201.88667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867201.88722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867201.90492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867201.90513: stderr chunk (state=3): >>><<< 13131 1726867201.90516: stdout chunk (state=3): >>><<< 13131 1726867201.90528: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 
originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867201.90531: _low_level_execute_command(): starting 13131 1726867201.90536: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867201.8219504-14063-101072624107754/AnsiballZ_stat.py && sleep 0' 13131 1726867201.90937: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867201.90964: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867201.90967: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 13131 1726867201.90969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867201.90971: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867201.91021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867201.91025: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867201.91083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867202.06831: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28566, "dev": 23, "nlink": 1, "atime": 1726867200.615878, "mtime": 1726867200.615878, "ctime": 1726867200.615878, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13131 1726867202.08609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867202.08614: stdout chunk (state=3): >>><<< 13131 1726867202.08616: stderr chunk (state=3): >>><<< 13131 1726867202.08618: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28566, "dev": 23, "nlink": 1, "atime": 1726867200.615878, "mtime": 1726867200.615878, "ctime": 1726867200.615878, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867202.08621: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867201.8219504-14063-101072624107754/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867202.08623: _low_level_execute_command(): starting 13131 1726867202.08625: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867201.8219504-14063-101072624107754/ > /dev/null 2>&1 && sleep 0' 13131 1726867202.09298: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867202.09314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867202.09347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867202.09360: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867202.09395: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867202.09458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.09501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867202.09519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867202.09538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867202.09629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867202.11551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867202.11554: stdout chunk (state=3): >>><<< 13131 1726867202.11556: stderr chunk (state=3): >>><<< 13131 1726867202.11572: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867202.11782: handler run complete 13131 1726867202.11785: attempt loop complete, returning result 13131 1726867202.11788: _execute() done 13131 1726867202.11790: dumping result to json 13131 1726867202.11794: done dumping result, returning 13131 1726867202.11797: done running TaskExecutor() for managed_node1/TASK: Get stat for interface nm-bond [0affcac9-a3a5-5f24-9b7a-000000000337] 13131 1726867202.11799: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000337 ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726867200.615878, "block_size": 4096, "blocks": 0, "ctime": 1726867200.615878, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28566, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1726867200.615878, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13131 1726867202.11976: no more pending results, returning what we have 13131 1726867202.11982: results queue empty 13131 1726867202.11983: checking for 
any_errors_fatal 13131 1726867202.11984: done checking for any_errors_fatal 13131 1726867202.11985: checking for max_fail_percentage 13131 1726867202.11986: done checking for max_fail_percentage 13131 1726867202.11987: checking to see if all hosts have failed and the running result is not ok 13131 1726867202.11988: done checking to see if all hosts have failed 13131 1726867202.11988: getting the remaining hosts for this loop 13131 1726867202.11990: done getting the remaining hosts for this loop 13131 1726867202.11994: getting the next task for host managed_node1 13131 1726867202.12002: done getting next task for host managed_node1 13131 1726867202.12141: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13131 1726867202.12144: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867202.12149: getting variables 13131 1726867202.12150: in VariableManager get_vars() 13131 1726867202.12238: Calling all_inventory to load vars for managed_node1 13131 1726867202.12249: Calling groups_inventory to load vars for managed_node1 13131 1726867202.12252: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867202.12264: Calling all_plugins_play to load vars for managed_node1 13131 1726867202.12267: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867202.12271: Calling groups_plugins_play to load vars for managed_node1 13131 1726867202.12796: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000337 13131 1726867202.12801: WORKER PROCESS EXITING 13131 1726867202.13624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867202.14493: done with get_vars() 13131 1726867202.14508: done getting variables 13131 1726867202.14551: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867202.14640: variable 'interface' from source: task vars 13131 1726867202.14643: variable 'controller_device' from source: play vars 13131 1726867202.14686: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 17:20:02 -0400 (0:00:00.372) 0:00:17.257 ****** 13131 1726867202.14710: entering _queue_task() for managed_node1/assert 13131 1726867202.14943: worker is 1 (out of 1 available) 13131 1726867202.14957: exiting _queue_task() for managed_node1/assert 
13131 1726867202.14970: done queuing things up, now waiting for results queue to drain 13131 1726867202.14971: waiting for pending results... 13131 1726867202.15157: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'nm-bond' 13131 1726867202.15275: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000006f 13131 1726867202.15289: variable 'ansible_search_path' from source: unknown 13131 1726867202.15292: variable 'ansible_search_path' from source: unknown 13131 1726867202.15331: calling self._execute() 13131 1726867202.15420: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867202.15423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867202.15451: variable 'omit' from source: magic vars 13131 1726867202.15885: variable 'ansible_distribution_major_version' from source: facts 13131 1726867202.15890: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867202.15892: variable 'omit' from source: magic vars 13131 1726867202.15894: variable 'omit' from source: magic vars 13131 1726867202.16014: variable 'interface' from source: task vars 13131 1726867202.16082: variable 'controller_device' from source: play vars 13131 1726867202.16085: variable 'controller_device' from source: play vars 13131 1726867202.16122: variable 'omit' from source: magic vars 13131 1726867202.16165: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867202.16208: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867202.16244: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867202.16286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867202.16290: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867202.16324: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867202.16327: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867202.16339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867202.16395: Set connection var ansible_connection to ssh 13131 1726867202.16408: Set connection var ansible_timeout to 10 13131 1726867202.16411: Set connection var ansible_shell_type to sh 13131 1726867202.16420: Set connection var ansible_shell_executable to /bin/sh 13131 1726867202.16429: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867202.16435: Set connection var ansible_pipelining to False 13131 1726867202.16454: variable 'ansible_shell_executable' from source: unknown 13131 1726867202.16457: variable 'ansible_connection' from source: unknown 13131 1726867202.16460: variable 'ansible_module_compression' from source: unknown 13131 1726867202.16462: variable 'ansible_shell_type' from source: unknown 13131 1726867202.16465: variable 'ansible_shell_executable' from source: unknown 13131 1726867202.16467: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867202.16469: variable 'ansible_pipelining' from source: unknown 13131 1726867202.16472: variable 'ansible_timeout' from source: unknown 13131 1726867202.16476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867202.16580: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867202.16589: variable 'omit' from source: magic vars 13131 1726867202.16596: starting 
attempt loop 13131 1726867202.16599: running the handler 13131 1726867202.16691: variable 'interface_stat' from source: set_fact 13131 1726867202.16708: Evaluated conditional (interface_stat.stat.exists): True 13131 1726867202.16713: handler run complete 13131 1726867202.16723: attempt loop complete, returning result 13131 1726867202.16725: _execute() done 13131 1726867202.16728: dumping result to json 13131 1726867202.16731: done dumping result, returning 13131 1726867202.16737: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'nm-bond' [0affcac9-a3a5-5f24-9b7a-00000000006f] 13131 1726867202.16741: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000006f 13131 1726867202.16824: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000006f 13131 1726867202.16827: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 13131 1726867202.16918: no more pending results, returning what we have 13131 1726867202.16921: results queue empty 13131 1726867202.16922: checking for any_errors_fatal 13131 1726867202.16929: done checking for any_errors_fatal 13131 1726867202.16929: checking for max_fail_percentage 13131 1726867202.16931: done checking for max_fail_percentage 13131 1726867202.16931: checking to see if all hosts have failed and the running result is not ok 13131 1726867202.16932: done checking to see if all hosts have failed 13131 1726867202.16933: getting the remaining hosts for this loop 13131 1726867202.16934: done getting the remaining hosts for this loop 13131 1726867202.16936: getting the next task for host managed_node1 13131 1726867202.16942: done getting next task for host managed_node1 13131 1726867202.16944: ^ task is: TASK: Include the task 'assert_profile_present.yml' 13131 1726867202.16946: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867202.16949: getting variables 13131 1726867202.16950: in VariableManager get_vars() 13131 1726867202.17001: Calling all_inventory to load vars for managed_node1 13131 1726867202.17004: Calling groups_inventory to load vars for managed_node1 13131 1726867202.17006: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867202.17015: Calling all_plugins_play to load vars for managed_node1 13131 1726867202.17017: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867202.17019: Calling groups_plugins_play to load vars for managed_node1 13131 1726867202.17775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867202.18952: done with get_vars() 13131 1726867202.18976: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:67 Friday 20 September 2024 17:20:02 -0400 (0:00:00.044) 0:00:17.301 ****** 13131 1726867202.19126: entering _queue_task() for managed_node1/include_tasks 13131 1726867202.19469: worker is 1 (out of 1 available) 13131 1726867202.19492: exiting _queue_task() for managed_node1/include_tasks 13131 1726867202.19506: done queuing things up, now waiting for results queue to drain 13131 1726867202.19507: waiting for pending results... 
13131 1726867202.19774: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' 13131 1726867202.19861: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000070 13131 1726867202.19880: variable 'ansible_search_path' from source: unknown 13131 1726867202.19937: variable 'controller_profile' from source: play vars 13131 1726867202.20089: variable 'controller_profile' from source: play vars 13131 1726867202.20103: variable 'port1_profile' from source: play vars 13131 1726867202.20154: variable 'port1_profile' from source: play vars 13131 1726867202.20158: variable 'port2_profile' from source: play vars 13131 1726867202.20208: variable 'port2_profile' from source: play vars 13131 1726867202.20217: variable 'omit' from source: magic vars 13131 1726867202.20320: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867202.20328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867202.20338: variable 'omit' from source: magic vars 13131 1726867202.20511: variable 'ansible_distribution_major_version' from source: facts 13131 1726867202.20519: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867202.20540: variable 'item' from source: unknown 13131 1726867202.20587: variable 'item' from source: unknown 13131 1726867202.20699: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867202.20702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867202.20705: variable 'omit' from source: magic vars 13131 1726867202.20789: variable 'ansible_distribution_major_version' from source: facts 13131 1726867202.20793: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867202.20816: variable 'item' from source: unknown 13131 1726867202.20861: variable 'item' from source: unknown 13131 1726867202.20922: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 
1726867202.20930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867202.20944: variable 'omit' from source: magic vars 13131 1726867202.21037: variable 'ansible_distribution_major_version' from source: facts 13131 1726867202.21043: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867202.21063: variable 'item' from source: unknown 13131 1726867202.21108: variable 'item' from source: unknown 13131 1726867202.21167: dumping result to json 13131 1726867202.21170: done dumping result, returning 13131 1726867202.21172: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' [0affcac9-a3a5-5f24-9b7a-000000000070] 13131 1726867202.21175: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000070 13131 1726867202.21211: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000070 13131 1726867202.21213: WORKER PROCESS EXITING 13131 1726867202.21239: no more pending results, returning what we have 13131 1726867202.21244: in VariableManager get_vars() 13131 1726867202.21298: Calling all_inventory to load vars for managed_node1 13131 1726867202.21301: Calling groups_inventory to load vars for managed_node1 13131 1726867202.21303: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867202.21315: Calling all_plugins_play to load vars for managed_node1 13131 1726867202.21317: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867202.21319: Calling groups_plugins_play to load vars for managed_node1 13131 1726867202.25893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867202.26897: done with get_vars() 13131 1726867202.26915: variable 'ansible_search_path' from source: unknown 13131 1726867202.26928: variable 'ansible_search_path' from source: unknown 13131 1726867202.26935: variable 'ansible_search_path' from source: unknown 13131 
1726867202.26940: we have included files to process 13131 1726867202.26941: generating all_blocks data 13131 1726867202.26942: done generating all_blocks data 13131 1726867202.26945: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13131 1726867202.26945: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13131 1726867202.26947: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13131 1726867202.27109: in VariableManager get_vars() 13131 1726867202.27137: done with get_vars() 13131 1726867202.27369: done processing included file 13131 1726867202.27370: iterating over new_blocks loaded from include file 13131 1726867202.27372: in VariableManager get_vars() 13131 1726867202.27397: done with get_vars() 13131 1726867202.27399: filtering new block on tags 13131 1726867202.27418: done filtering new block on tags 13131 1726867202.27420: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=bond0) 13131 1726867202.27424: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13131 1726867202.27425: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13131 1726867202.27428: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13131 1726867202.27524: in VariableManager get_vars() 13131 1726867202.27551: done with get_vars() 13131 1726867202.27737: done 
processing included file 13131 1726867202.27738: iterating over new_blocks loaded from include file 13131 1726867202.27739: in VariableManager get_vars() 13131 1726867202.27756: done with get_vars() 13131 1726867202.27757: filtering new block on tags 13131 1726867202.27768: done filtering new block on tags 13131 1726867202.27769: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=bond0.0) 13131 1726867202.27771: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13131 1726867202.27772: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13131 1726867202.27774: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13131 1726867202.27868: in VariableManager get_vars() 13131 1726867202.27887: done with get_vars() 13131 1726867202.28031: done processing included file 13131 1726867202.28033: iterating over new_blocks loaded from include file 13131 1726867202.28034: in VariableManager get_vars() 13131 1726867202.28048: done with get_vars() 13131 1726867202.28050: filtering new block on tags 13131 1726867202.28060: done filtering new block on tags 13131 1726867202.28061: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=bond0.1) 13131 1726867202.28064: extending task lists for all hosts with included blocks 13131 1726867202.31115: done extending task lists 13131 1726867202.31120: done processing included files 13131 1726867202.31120: results queue empty 13131 
1726867202.31121: checking for any_errors_fatal 13131 1726867202.31123: done checking for any_errors_fatal 13131 1726867202.31124: checking for max_fail_percentage 13131 1726867202.31124: done checking for max_fail_percentage 13131 1726867202.31125: checking to see if all hosts have failed and the running result is not ok 13131 1726867202.31125: done checking to see if all hosts have failed 13131 1726867202.31126: getting the remaining hosts for this loop 13131 1726867202.31126: done getting the remaining hosts for this loop 13131 1726867202.31128: getting the next task for host managed_node1 13131 1726867202.31130: done getting next task for host managed_node1 13131 1726867202.31131: ^ task is: TASK: Include the task 'get_profile_stat.yml' 13131 1726867202.31133: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867202.31134: getting variables 13131 1726867202.31135: in VariableManager get_vars() 13131 1726867202.31146: Calling all_inventory to load vars for managed_node1 13131 1726867202.31148: Calling groups_inventory to load vars for managed_node1 13131 1726867202.31149: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867202.31153: Calling all_plugins_play to load vars for managed_node1 13131 1726867202.31154: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867202.31156: Calling groups_plugins_play to load vars for managed_node1 13131 1726867202.31844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867202.32705: done with get_vars() 13131 1726867202.32718: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 17:20:02 -0400 (0:00:00.136) 0:00:17.438 ****** 13131 1726867202.32761: entering _queue_task() for managed_node1/include_tasks 13131 1726867202.33027: worker is 1 (out of 1 available) 13131 1726867202.33039: exiting _queue_task() for managed_node1/include_tasks 13131 1726867202.33052: done queuing things up, now waiting for results queue to drain 13131 1726867202.33053: waiting for pending results... 
13131 1726867202.33223: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 13131 1726867202.33291: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000355 13131 1726867202.33299: variable 'ansible_search_path' from source: unknown 13131 1726867202.33302: variable 'ansible_search_path' from source: unknown 13131 1726867202.33332: calling self._execute() 13131 1726867202.33406: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867202.33412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867202.33420: variable 'omit' from source: magic vars 13131 1726867202.33697: variable 'ansible_distribution_major_version' from source: facts 13131 1726867202.33704: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867202.33710: _execute() done 13131 1726867202.33715: dumping result to json 13131 1726867202.33719: done dumping result, returning 13131 1726867202.33722: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-5f24-9b7a-000000000355] 13131 1726867202.33725: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000355 13131 1726867202.33812: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000355 13131 1726867202.33815: WORKER PROCESS EXITING 13131 1726867202.33862: no more pending results, returning what we have 13131 1726867202.33866: in VariableManager get_vars() 13131 1726867202.33923: Calling all_inventory to load vars for managed_node1 13131 1726867202.33927: Calling groups_inventory to load vars for managed_node1 13131 1726867202.33929: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867202.33939: Calling all_plugins_play to load vars for managed_node1 13131 1726867202.33943: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867202.33946: Calling groups_plugins_play to load vars for managed_node1 13131 
1726867202.34700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867202.35554: done with get_vars() 13131 1726867202.35568: variable 'ansible_search_path' from source: unknown 13131 1726867202.35569: variable 'ansible_search_path' from source: unknown 13131 1726867202.35595: we have included files to process 13131 1726867202.35596: generating all_blocks data 13131 1726867202.35597: done generating all_blocks data 13131 1726867202.35598: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13131 1726867202.35599: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13131 1726867202.35600: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13131 1726867202.36556: done processing included file 13131 1726867202.36558: iterating over new_blocks loaded from include file 13131 1726867202.36560: in VariableManager get_vars() 13131 1726867202.36587: done with get_vars() 13131 1726867202.36589: filtering new block on tags 13131 1726867202.36611: done filtering new block on tags 13131 1726867202.36613: in VariableManager get_vars() 13131 1726867202.36636: done with get_vars() 13131 1726867202.36638: filtering new block on tags 13131 1726867202.36657: done filtering new block on tags 13131 1726867202.36659: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 13131 1726867202.36664: extending task lists for all hosts with included blocks 13131 1726867202.36823: done extending task lists 13131 1726867202.36824: done processing included files 13131 1726867202.36825: results queue empty 13131 
1726867202.36825: checking for any_errors_fatal 13131 1726867202.36828: done checking for any_errors_fatal 13131 1726867202.36828: checking for max_fail_percentage 13131 1726867202.36829: done checking for max_fail_percentage 13131 1726867202.36829: checking to see if all hosts have failed and the running result is not ok 13131 1726867202.36830: done checking to see if all hosts have failed 13131 1726867202.36830: getting the remaining hosts for this loop 13131 1726867202.36831: done getting the remaining hosts for this loop 13131 1726867202.36832: getting the next task for host managed_node1 13131 1726867202.36835: done getting next task for host managed_node1 13131 1726867202.36836: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 13131 1726867202.36840: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867202.36841: getting variables 13131 1726867202.36842: in VariableManager get_vars() 13131 1726867202.36967: Calling all_inventory to load vars for managed_node1 13131 1726867202.36969: Calling groups_inventory to load vars for managed_node1 13131 1726867202.36970: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867202.36974: Calling all_plugins_play to load vars for managed_node1 13131 1726867202.36975: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867202.36979: Calling groups_plugins_play to load vars for managed_node1 13131 1726867202.37574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867202.38416: done with get_vars() 13131 1726867202.38432: done getting variables 13131 1726867202.38458: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:20:02 -0400 (0:00:00.057) 0:00:17.495 ****** 13131 1726867202.38476: entering _queue_task() for managed_node1/set_fact 13131 1726867202.38722: worker is 1 (out of 1 available) 13131 1726867202.38736: exiting _queue_task() for managed_node1/set_fact 13131 1726867202.38747: done queuing things up, now waiting for results queue to drain 13131 1726867202.38748: waiting for pending results... 
13131 1726867202.39014: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 13131 1726867202.39102: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000005e4 13131 1726867202.39151: variable 'ansible_search_path' from source: unknown 13131 1726867202.39155: variable 'ansible_search_path' from source: unknown 13131 1726867202.39197: calling self._execute() 13131 1726867202.39339: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867202.39360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867202.39387: variable 'omit' from source: magic vars 13131 1726867202.40112: variable 'ansible_distribution_major_version' from source: facts 13131 1726867202.40129: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867202.40139: variable 'omit' from source: magic vars 13131 1726867202.40196: variable 'omit' from source: magic vars 13131 1726867202.40282: variable 'omit' from source: magic vars 13131 1726867202.40285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867202.40328: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867202.40353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867202.40375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867202.40396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867202.40439: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867202.40483: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867202.40486: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 13131 1726867202.40563: Set connection var ansible_connection to ssh 13131 1726867202.40576: Set connection var ansible_timeout to 10 13131 1726867202.40585: Set connection var ansible_shell_type to sh 13131 1726867202.40600: Set connection var ansible_shell_executable to /bin/sh 13131 1726867202.40613: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867202.40646: Set connection var ansible_pipelining to False 13131 1726867202.40655: variable 'ansible_shell_executable' from source: unknown 13131 1726867202.40662: variable 'ansible_connection' from source: unknown 13131 1726867202.40669: variable 'ansible_module_compression' from source: unknown 13131 1726867202.40676: variable 'ansible_shell_type' from source: unknown 13131 1726867202.40756: variable 'ansible_shell_executable' from source: unknown 13131 1726867202.40759: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867202.40762: variable 'ansible_pipelining' from source: unknown 13131 1726867202.40764: variable 'ansible_timeout' from source: unknown 13131 1726867202.40766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867202.40862: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867202.40881: variable 'omit' from source: magic vars 13131 1726867202.40891: starting attempt loop 13131 1726867202.40900: running the handler 13131 1726867202.40916: handler run complete 13131 1726867202.40931: attempt loop complete, returning result 13131 1726867202.40937: _execute() done 13131 1726867202.40944: dumping result to json 13131 1726867202.40953: done dumping result, returning 13131 1726867202.40979: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-5f24-9b7a-0000000005e4] 13131 1726867202.40982: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000005e4 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 13131 1726867202.41117: no more pending results, returning what we have 13131 1726867202.41120: results queue empty 13131 1726867202.41121: checking for any_errors_fatal 13131 1726867202.41122: done checking for any_errors_fatal 13131 1726867202.41123: checking for max_fail_percentage 13131 1726867202.41124: done checking for max_fail_percentage 13131 1726867202.41125: checking to see if all hosts have failed and the running result is not ok 13131 1726867202.41125: done checking to see if all hosts have failed 13131 1726867202.41126: getting the remaining hosts for this loop 13131 1726867202.41127: done getting the remaining hosts for this loop 13131 1726867202.41130: getting the next task for host managed_node1 13131 1726867202.41136: done getting next task for host managed_node1 13131 1726867202.41138: ^ task is: TASK: Stat profile file 13131 1726867202.41143: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867202.41147: getting variables 13131 1726867202.41148: in VariableManager get_vars() 13131 1726867202.41198: Calling all_inventory to load vars for managed_node1 13131 1726867202.41201: Calling groups_inventory to load vars for managed_node1 13131 1726867202.41203: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867202.41213: Calling all_plugins_play to load vars for managed_node1 13131 1726867202.41215: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867202.41218: Calling groups_plugins_play to load vars for managed_node1 13131 1726867202.41796: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000005e4 13131 1726867202.41799: WORKER PROCESS EXITING 13131 1726867202.42776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867202.44535: done with get_vars() 13131 1726867202.44565: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:20:02 -0400 (0:00:00.061) 0:00:17.557 ****** 13131 1726867202.44673: entering _queue_task() for managed_node1/stat 13131 1726867202.45025: worker is 1 (out of 1 available) 13131 1726867202.45038: exiting _queue_task() for managed_node1/stat 13131 1726867202.45050: done queuing things up, now waiting for results queue to drain 13131 1726867202.45051: waiting for pending results... 
13131 1726867202.45282: running TaskExecutor() for managed_node1/TASK: Stat profile file 13131 1726867202.45369: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000005e5 13131 1726867202.45385: variable 'ansible_search_path' from source: unknown 13131 1726867202.45388: variable 'ansible_search_path' from source: unknown 13131 1726867202.45425: calling self._execute() 13131 1726867202.45569: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867202.45573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867202.45576: variable 'omit' from source: magic vars 13131 1726867202.46086: variable 'ansible_distribution_major_version' from source: facts 13131 1726867202.46098: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867202.46106: variable 'omit' from source: magic vars 13131 1726867202.46152: variable 'omit' from source: magic vars 13131 1726867202.46247: variable 'profile' from source: include params 13131 1726867202.46251: variable 'item' from source: include params 13131 1726867202.46330: variable 'item' from source: include params 13131 1726867202.46333: variable 'omit' from source: magic vars 13131 1726867202.46401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867202.46405: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867202.46421: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867202.46439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867202.46451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867202.46481: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 
1726867202.46484: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867202.46486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867202.46576: Set connection var ansible_connection to ssh 13131 1726867202.46586: Set connection var ansible_timeout to 10 13131 1726867202.46589: Set connection var ansible_shell_type to sh 13131 1726867202.46598: Set connection var ansible_shell_executable to /bin/sh 13131 1726867202.46610: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867202.46656: Set connection var ansible_pipelining to False 13131 1726867202.46662: variable 'ansible_shell_executable' from source: unknown 13131 1726867202.46667: variable 'ansible_connection' from source: unknown 13131 1726867202.46670: variable 'ansible_module_compression' from source: unknown 13131 1726867202.46672: variable 'ansible_shell_type' from source: unknown 13131 1726867202.46674: variable 'ansible_shell_executable' from source: unknown 13131 1726867202.46676: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867202.46680: variable 'ansible_pipelining' from source: unknown 13131 1726867202.46683: variable 'ansible_timeout' from source: unknown 13131 1726867202.46684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867202.46840: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867202.46874: variable 'omit' from source: magic vars 13131 1726867202.46879: starting attempt loop 13131 1726867202.46885: running the handler 13131 1726867202.46888: _low_level_execute_command(): starting 13131 1726867202.46890: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867202.47553: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867202.47600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867202.47604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867202.47608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867202.47610: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867202.47613: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867202.47620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.47634: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867202.47643: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867202.47651: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867202.47659: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867202.47706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867202.47710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867202.47712: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867202.47714: stderr chunk (state=3): >>>debug2: match found <<< 13131 1726867202.47717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.47765: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867202.47821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867202.47824: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867202.47881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867202.49575: stdout chunk (state=3): >>>/root <<< 13131 1726867202.49758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867202.49761: stdout chunk (state=3): >>><<< 13131 1726867202.49764: stderr chunk (state=3): >>><<< 13131 1726867202.49790: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867202.49897: _low_level_execute_command(): starting 13131 1726867202.49901: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867202.4980364-14093-181524295164115 `" && 
echo ansible-tmp-1726867202.4980364-14093-181524295164115="` echo /root/.ansible/tmp/ansible-tmp-1726867202.4980364-14093-181524295164115 `" ) && sleep 0' 13131 1726867202.50647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.50652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.50706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867202.50758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867202.50762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867202.50861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867202.52861: stdout chunk (state=3): >>>ansible-tmp-1726867202.4980364-14093-181524295164115=/root/.ansible/tmp/ansible-tmp-1726867202.4980364-14093-181524295164115 <<< 13131 1726867202.53026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867202.53029: stdout chunk (state=3): >>><<< 13131 1726867202.53032: stderr chunk (state=3): >>><<< 
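The `mkdir` one-liner above is Ansible's standard remote-tmpdir dance: `umask 77` so the new directories are created mode 700, `mkdir -p` for the parent, a plain `mkdir` for the per-task directory (which fails loudly if it already exists), and a final `echo name=path` so the controller learns the resolved path from stdout. A local re-run of the same pattern with a fixed demo path (the real name embeds a timestamp and a random suffix) looks like:

```shell
# Demo of the remote-tmpdir creation pattern from the log above.
# /tmp/ansible-demo is an illustrative path, not the one from this run.
rm -rf /tmp/ansible-demo
/bin/sh -c '( umask 77 && mkdir -p "` echo /tmp/ansible-demo `" && mkdir "` echo /tmp/ansible-demo/ansible-tmp-demo `" && echo ansible-tmp-demo="` echo /tmp/ansible-demo/ansible-tmp-demo `" ) && sleep 0'
```

This prints `ansible-tmp-demo=/tmp/ansible-demo/ansible-tmp-demo`, matching the `stdout chunk` the controller parses in the next entry to locate the tmpdir for the module upload.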
13131 1726867202.53084: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867202.4980364-14093-181524295164115=/root/.ansible/tmp/ansible-tmp-1726867202.4980364-14093-181524295164115 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867202.53116: variable 'ansible_module_compression' from source: unknown 13131 1726867202.53194: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13131 1726867202.53239: variable 'ansible_facts' from source: unknown 13131 1726867202.53383: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867202.4980364-14093-181524295164115/AnsiballZ_stat.py 13131 1726867202.53502: Sending initial data 13131 1726867202.53518: Sent initial data (153 bytes) 13131 1726867202.54002: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867202.54008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867202.54041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867202.54044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.54047: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867202.54049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867202.54051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.54105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867202.54112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867202.54164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867202.55875: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867202.55935: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13131 1726867202.55993: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmps6afrvm8 /root/.ansible/tmp/ansible-tmp-1726867202.4980364-14093-181524295164115/AnsiballZ_stat.py <<< 13131 1726867202.55997: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867202.4980364-14093-181524295164115/AnsiballZ_stat.py" <<< 13131 1726867202.56066: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 13131 1726867202.56076: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmps6afrvm8" to remote "/root/.ansible/tmp/ansible-tmp-1726867202.4980364-14093-181524295164115/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867202.4980364-14093-181524295164115/AnsiballZ_stat.py" <<< 13131 1726867202.56674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867202.56684: stderr chunk (state=3): >>><<< 13131 1726867202.56687: stdout chunk (state=3): >>><<< 13131 1726867202.56710: done transferring module to remote 13131 1726867202.56719: _low_level_execute_command(): starting 13131 1726867202.56721: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867202.4980364-14093-181524295164115/ /root/.ansible/tmp/ansible-tmp-1726867202.4980364-14093-181524295164115/AnsiballZ_stat.py && sleep 0' 13131 
1726867202.57138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867202.57142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867202.57147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.57150: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867202.57152: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.57183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867202.57196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867202.57250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867202.59131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867202.59150: stderr chunk (state=3): >>><<< 13131 1726867202.59153: stdout chunk (state=3): >>><<< 13131 1726867202.59165: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867202.59168: _low_level_execute_command(): starting 13131 1726867202.59172: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867202.4980364-14093-181524295164115/AnsiballZ_stat.py && sleep 0' 13131 1726867202.59582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867202.59585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.59588: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config <<< 13131 1726867202.59590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867202.59594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.59633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867202.59636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867202.59697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867202.75671: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13131 1726867202.77087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867202.77115: stderr chunk (state=3): >>><<< 13131 1726867202.77119: stdout chunk (state=3): >>><<< 13131 1726867202.77135: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
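The module run above returns `"exists": false` because `/etc/sysconfig/network-scripts/ifcfg-bond0` (a legacy initscripts ifcfg file) is absent on this host; the playbook is checking whether the profile was written as an ifcfg file before falling back to querying NetworkManager directly. The core of what the `stat` module reports here can be approximated in plain sh (a guaranteed-missing demo path is used so the sketch is self-contained):

```shell
# Minimal approximation of the stat module's existence check; the real
# module returns a much richer dict when the file is present.
f=/nonexistent/ifcfg-demo   # demo path standing in for ifcfg-bond0
if [ -e "$f" ]; then
  printf '{"stat": {"exists": true}}\n'
else
  printf '{"stat": {"exists": false}}\n'
fi
```

With the demo path this prints `{"stat": {"exists": false}}`, the same shape as the module JSON in the stdout chunk above.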
13131 1726867202.77162: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867202.4980364-14093-181524295164115/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867202.77167: _low_level_execute_command(): starting 13131 1726867202.77174: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867202.4980364-14093-181524295164115/ > /dev/null 2>&1 && sleep 0' 13131 1726867202.77644: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867202.77648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.77655: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867202.77657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867202.77659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.77705: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867202.77709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867202.77770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867202.79628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867202.79651: stderr chunk (state=3): >>><<< 13131 1726867202.79654: stdout chunk (state=3): >>><<< 13131 1726867202.79666: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867202.79672: handler run complete 13131 1726867202.79694: attempt loop complete, returning result 13131 1726867202.79698: _execute() done 13131 1726867202.79700: dumping result to json 13131 1726867202.79702: done dumping result, returning 13131 1726867202.79708: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0affcac9-a3a5-5f24-9b7a-0000000005e5] 13131 1726867202.79712: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000005e5 13131 1726867202.79803: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000005e5 13131 1726867202.79805: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 13131 1726867202.79866: no more pending results, returning what we have 13131 1726867202.79869: results queue empty 13131 1726867202.79869: checking for any_errors_fatal 13131 1726867202.79876: done checking for any_errors_fatal 13131 1726867202.79876: checking for max_fail_percentage 13131 1726867202.79880: done checking for max_fail_percentage 13131 1726867202.79881: checking to see if all hosts have failed and the running result is not ok 13131 1726867202.79882: done checking to see if all hosts have failed 13131 1726867202.79882: getting the remaining hosts for this loop 13131 1726867202.79884: done getting the remaining hosts for this loop 13131 1726867202.79887: getting the next task for host managed_node1 13131 1726867202.79895: done getting next task for host managed_node1 13131 1726867202.79897: ^ task is: TASK: Set NM profile exist flag based on the profile files 13131 1726867202.79902: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867202.79905: getting variables 13131 1726867202.79907: in VariableManager get_vars() 13131 1726867202.79963: Calling all_inventory to load vars for managed_node1 13131 1726867202.79966: Calling groups_inventory to load vars for managed_node1 13131 1726867202.79968: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867202.79980: Calling all_plugins_play to load vars for managed_node1 13131 1726867202.79983: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867202.79986: Calling groups_plugins_play to load vars for managed_node1 13131 1726867202.80785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867202.81660: done with get_vars() 13131 1726867202.81675: done getting variables 13131 1726867202.81720: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:20:02 -0400 (0:00:00.370) 0:00:17.928 ****** 13131 1726867202.81741: entering _queue_task() for managed_node1/set_fact 13131 1726867202.81962: worker is 1 (out of 1 available) 13131 1726867202.81974: exiting _queue_task() for managed_node1/set_fact 13131 1726867202.81988: done queuing things up, now waiting for results queue to drain 13131 1726867202.81989: waiting for pending results... 13131 1726867202.82162: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 13131 1726867202.82232: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000005e6 13131 1726867202.82243: variable 'ansible_search_path' from source: unknown 13131 1726867202.82247: variable 'ansible_search_path' from source: unknown 13131 1726867202.82274: calling self._execute() 13131 1726867202.82348: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867202.82353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867202.82362: variable 'omit' from source: magic vars 13131 1726867202.82639: variable 'ansible_distribution_major_version' from source: facts 13131 1726867202.82651: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867202.82735: variable 'profile_stat' from source: set_fact 13131 1726867202.82745: Evaluated conditional (profile_stat.stat.exists): False 13131 1726867202.82748: when evaluation is False, skipping this task 13131 1726867202.82751: _execute() done 13131 1726867202.82755: dumping result to json 13131 1726867202.82758: done dumping result, returning 13131 1726867202.82771: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-5f24-9b7a-0000000005e6] 13131 1726867202.82774: sending task result for task 
0affcac9-a3a5-5f24-9b7a-0000000005e6 13131 1726867202.82846: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000005e6 13131 1726867202.82849: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13131 1726867202.82920: no more pending results, returning what we have 13131 1726867202.82923: results queue empty 13131 1726867202.82924: checking for any_errors_fatal 13131 1726867202.82929: done checking for any_errors_fatal 13131 1726867202.82929: checking for max_fail_percentage 13131 1726867202.82931: done checking for max_fail_percentage 13131 1726867202.82931: checking to see if all hosts have failed and the running result is not ok 13131 1726867202.82932: done checking to see if all hosts have failed 13131 1726867202.82933: getting the remaining hosts for this loop 13131 1726867202.82934: done getting the remaining hosts for this loop 13131 1726867202.82937: getting the next task for host managed_node1 13131 1726867202.82942: done getting next task for host managed_node1 13131 1726867202.82944: ^ task is: TASK: Get NM profile info 13131 1726867202.82947: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13131 1726867202.82950: getting variables 13131 1726867202.82951: in VariableManager get_vars() 13131 1726867202.83001: Calling all_inventory to load vars for managed_node1 13131 1726867202.83004: Calling groups_inventory to load vars for managed_node1 13131 1726867202.83006: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867202.83014: Calling all_plugins_play to load vars for managed_node1 13131 1726867202.83016: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867202.83019: Calling groups_plugins_play to load vars for managed_node1 13131 1726867202.83864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867202.84735: done with get_vars() 13131 1726867202.84749: done getting variables 13131 1726867202.84790: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:20:02 -0400 (0:00:00.030) 0:00:17.958 ****** 13131 1726867202.84812: entering _queue_task() for managed_node1/shell 13131 1726867202.85005: worker is 1 (out of 1 available) 13131 1726867202.85018: exiting _queue_task() for managed_node1/shell 13131 1726867202.85029: done queuing things up, now waiting for results queue to drain 13131 1726867202.85030: waiting for pending results... 
13131 1726867202.85198: running TaskExecutor() for managed_node1/TASK: Get NM profile info 13131 1726867202.85264: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000005e7 13131 1726867202.85278: variable 'ansible_search_path' from source: unknown 13131 1726867202.85282: variable 'ansible_search_path' from source: unknown 13131 1726867202.85312: calling self._execute() 13131 1726867202.85375: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867202.85382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867202.85394: variable 'omit' from source: magic vars 13131 1726867202.85654: variable 'ansible_distribution_major_version' from source: facts 13131 1726867202.85664: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867202.85670: variable 'omit' from source: magic vars 13131 1726867202.85706: variable 'omit' from source: magic vars 13131 1726867202.85775: variable 'profile' from source: include params 13131 1726867202.85781: variable 'item' from source: include params 13131 1726867202.85827: variable 'item' from source: include params 13131 1726867202.85846: variable 'omit' from source: magic vars 13131 1726867202.85876: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867202.85905: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867202.85919: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867202.85934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867202.85951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867202.86031: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 
1726867202.86035: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867202.86038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867202.86212: Set connection var ansible_connection to ssh 13131 1726867202.86215: Set connection var ansible_timeout to 10 13131 1726867202.86218: Set connection var ansible_shell_type to sh 13131 1726867202.86220: Set connection var ansible_shell_executable to /bin/sh 13131 1726867202.86223: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867202.86225: Set connection var ansible_pipelining to False 13131 1726867202.86228: variable 'ansible_shell_executable' from source: unknown 13131 1726867202.86230: variable 'ansible_connection' from source: unknown 13131 1726867202.86232: variable 'ansible_module_compression' from source: unknown 13131 1726867202.86234: variable 'ansible_shell_type' from source: unknown 13131 1726867202.86236: variable 'ansible_shell_executable' from source: unknown 13131 1726867202.86238: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867202.86240: variable 'ansible_pipelining' from source: unknown 13131 1726867202.86243: variable 'ansible_timeout' from source: unknown 13131 1726867202.86245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867202.86297: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867202.86320: variable 'omit' from source: magic vars 13131 1726867202.86323: starting attempt loop 13131 1726867202.86326: running the handler 13131 1726867202.86328: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867202.86337: _low_level_execute_command(): starting 13131 1726867202.86345: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867202.86975: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867202.86989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867202.86996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867202.87011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867202.87021: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867202.87028: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867202.87038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.87056: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867202.87059: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867202.87067: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867202.87085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867202.87090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867202.87101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867202.87104: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867202.87116: stderr chunk (state=3): >>>debug2: match found <<< 13131 
1726867202.87122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.87196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867202.87209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867202.87227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867202.87299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867202.89002: stdout chunk (state=3): >>>/root <<< 13131 1726867202.89230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867202.89233: stdout chunk (state=3): >>><<< 13131 1726867202.89235: stderr chunk (state=3): >>><<< 13131 1726867202.89240: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867202.89252: _low_level_execute_command(): starting 13131 1726867202.89255: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867202.8916218-14116-37817593502956 `" && echo ansible-tmp-1726867202.8916218-14116-37817593502956="` echo /root/.ansible/tmp/ansible-tmp-1726867202.8916218-14116-37817593502956 `" ) && sleep 0' 13131 1726867202.89894: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867202.89897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867202.89901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 13131 1726867202.89904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867202.89907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867202.89909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867202.89989: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13131 1726867202.91890: stdout chunk (state=3): >>>ansible-tmp-1726867202.8916218-14116-37817593502956=/root/.ansible/tmp/ansible-tmp-1726867202.8916218-14116-37817593502956 <<< 13131 1726867202.91990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867202.92018: stderr chunk (state=3): >>><<< 13131 1726867202.92026: stdout chunk (state=3): >>><<< 13131 1726867202.92048: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867202.8916218-14116-37817593502956=/root/.ansible/tmp/ansible-tmp-1726867202.8916218-14116-37817593502956 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867202.92076: variable 'ansible_module_compression' from source: unknown 13131 1726867202.92123: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13131 1726867202.92161: variable 'ansible_facts' from source: unknown 13131 1726867202.92215: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867202.8916218-14116-37817593502956/AnsiballZ_command.py 13131 1726867202.92319: Sending initial data 13131 1726867202.92322: Sent initial data (155 bytes) 13131 1726867202.92739: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867202.92770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867202.92773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867202.92775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.92781: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867202.92783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867202.92785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.92839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867202.92842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867202.92847: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 13131 1726867202.92893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867202.94511: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867202.94567: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867202.94609: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpgkecfaoi /root/.ansible/tmp/ansible-tmp-1726867202.8916218-14116-37817593502956/AnsiballZ_command.py <<< 13131 1726867202.94612: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867202.8916218-14116-37817593502956/AnsiballZ_command.py" <<< 13131 1726867202.94655: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpgkecfaoi" to remote "/root/.ansible/tmp/ansible-tmp-1726867202.8916218-14116-37817593502956/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867202.8916218-14116-37817593502956/AnsiballZ_command.py" <<< 13131 1726867202.95406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867202.95450: stderr chunk (state=3): >>><<< 13131 1726867202.95609: stdout chunk (state=3): >>><<< 13131 1726867202.95612: done transferring module to remote 13131 1726867202.95614: _low_level_execute_command(): starting 13131 1726867202.95616: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867202.8916218-14116-37817593502956/ /root/.ansible/tmp/ansible-tmp-1726867202.8916218-14116-37817593502956/AnsiballZ_command.py && sleep 0' 13131 1726867202.96199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867202.96227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867202.96241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867202.96294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867202.96366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867202.96402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867202.96437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867202.96520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867202.98316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867202.98340: stderr chunk (state=3): >>><<< 13131 1726867202.98344: stdout chunk (state=3): >>><<< 13131 1726867202.98360: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867202.98364: _low_level_execute_command(): starting 13131 1726867202.98371: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867202.8916218-14116-37817593502956/AnsiballZ_command.py && sleep 0' 13131 1726867202.98961: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867202.98965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867202.99015: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13131 1726867203.16801: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 17:20:03.144491", "end": "2024-09-20 17:20:03.165504", "delta": "0:00:00.021013", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13131 1726867203.18904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867203.18910: stdout chunk (state=3): >>><<< 13131 1726867203.18913: stderr chunk (state=3): >>><<< 13131 1726867203.18916: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 17:20:03.144491", "end": "2024-09-20 17:20:03.165504", "delta": "0:00:00.021013", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867203.18920: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867202.8916218-14116-37817593502956/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867203.19013: _low_level_execute_command(): starting 13131 1726867203.19016: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867202.8916218-14116-37817593502956/ > /dev/null 2>&1 && sleep 0' 13131 1726867203.19908: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867203.19924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867203.19928: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867203.20014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867203.20061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867203.20111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867203.22183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867203.22186: stdout chunk (state=3): >>><<< 13131 1726867203.22189: stderr chunk (state=3): >>><<< 13131 1726867203.22191: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867203.22193: handler run complete 13131 1726867203.22195: Evaluated conditional (False): False 13131 1726867203.22197: attempt loop complete, returning result 13131 1726867203.22199: _execute() done 13131 1726867203.22201: dumping result to json 13131 1726867203.22203: done dumping result, returning 13131 1726867203.22205: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0affcac9-a3a5-5f24-9b7a-0000000005e7] 13131 1726867203.22207: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000005e7 13131 1726867203.22275: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000005e7 13131 1726867203.22281: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.021013", "end": "2024-09-20 17:20:03.165504", "rc": 0, "start": "2024-09-20 17:20:03.144491" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 13131 1726867203.22362: no more pending results, returning what we have 13131 1726867203.22366: results queue empty 13131 1726867203.22366: checking for any_errors_fatal 13131 1726867203.22373: done checking for any_errors_fatal 13131 1726867203.22373: checking for max_fail_percentage 13131 1726867203.22375: done checking for max_fail_percentage 13131 1726867203.22376: checking to see if all hosts have failed and the running result is not ok 13131 1726867203.22379: done checking to see if all hosts have failed 13131 1726867203.22380: getting the remaining hosts for this loop 13131 1726867203.22402: done getting the remaining hosts for this loop 13131 
1726867203.22407: getting the next task for host managed_node1 13131 1726867203.22415: done getting next task for host managed_node1 13131 1726867203.22417: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13131 1726867203.22422: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867203.22426: getting variables 13131 1726867203.22428: in VariableManager get_vars() 13131 1726867203.22702: Calling all_inventory to load vars for managed_node1 13131 1726867203.22706: Calling groups_inventory to load vars for managed_node1 13131 1726867203.22708: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867203.22719: Calling all_plugins_play to load vars for managed_node1 13131 1726867203.22722: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867203.22725: Calling groups_plugins_play to load vars for managed_node1 13131 1726867203.24389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867203.27591: done with get_vars() 13131 1726867203.27625: done getting variables 13131 1726867203.27697: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:20:03 -0400 (0:00:00.429) 0:00:18.387 ****** 13131 1726867203.27736: entering _queue_task() for managed_node1/set_fact 13131 1726867203.28220: worker is 1 (out of 1 available) 13131 1726867203.28233: exiting _queue_task() for managed_node1/set_fact 13131 1726867203.28246: done queuing things up, now waiting for results queue to drain 13131 1726867203.28248: waiting for pending results... 
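The "Get NM profile info" result above shells out to `nmcli -f NAME,FILENAME connection show | grep bond0 | grep /etc` and treats rc == 0 (at least one matching line) as "profile exists". A minimal Python sketch of the same check, using assumed sample output rather than a live host:

```python
# Sample NAME,FILENAME output in the shape nmcli printed above
# (assumed data, not captured from a managed node).
SAMPLE_NMCLI_OUTPUT = """\
bond0.0  /etc/NetworkManager/system-connections/bond0.0.nmconnection
bond0.1  /etc/NetworkManager/system-connections/bond0.1.nmconnection
bond0    /etc/NetworkManager/system-connections/bond0.nmconnection
"""

def nm_profile_exists(output: str, profile: str) -> bool:
    """Approximate the grep pipeline: keep lines that name the profile
    and whose backing file lives under /etc."""
    for line in output.splitlines():
        fields = line.split()
        if len(fields) == 2 and profile in fields[0] and fields[1].startswith("/etc"):
            return True
    return False

print(nm_profile_exists(SAMPLE_NMCLI_OUTPUT, "bond0"))  # → True
```

Note the grep pipeline matches substrings anywhere on the line; the field-based check here is a slightly stricter approximation.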
13131 1726867203.28676: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13131 1726867203.28750: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000005e8 13131 1726867203.28756: variable 'ansible_search_path' from source: unknown 13131 1726867203.28796: variable 'ansible_search_path' from source: unknown 13131 1726867203.28801: calling self._execute() 13131 1726867203.28899: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867203.28905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867203.28909: variable 'omit' from source: magic vars 13131 1726867203.29488: variable 'ansible_distribution_major_version' from source: facts 13131 1726867203.29494: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867203.29497: variable 'nm_profile_exists' from source: set_fact 13131 1726867203.29499: Evaluated conditional (nm_profile_exists.rc == 0): True 13131 1726867203.29502: variable 'omit' from source: magic vars 13131 1726867203.29590: variable 'omit' from source: magic vars 13131 1726867203.29622: variable 'omit' from source: magic vars 13131 1726867203.29748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867203.29929: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867203.29933: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867203.29936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867203.29938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867203.30017: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
13131 1726867203.30020: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867203.30023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867203.30303: Set connection var ansible_connection to ssh 13131 1726867203.30312: Set connection var ansible_timeout to 10 13131 1726867203.30315: Set connection var ansible_shell_type to sh 13131 1726867203.30327: Set connection var ansible_shell_executable to /bin/sh 13131 1726867203.30338: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867203.30343: Set connection var ansible_pipelining to False 13131 1726867203.30366: variable 'ansible_shell_executable' from source: unknown 13131 1726867203.30369: variable 'ansible_connection' from source: unknown 13131 1726867203.30372: variable 'ansible_module_compression' from source: unknown 13131 1726867203.30374: variable 'ansible_shell_type' from source: unknown 13131 1726867203.30378: variable 'ansible_shell_executable' from source: unknown 13131 1726867203.30380: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867203.30489: variable 'ansible_pipelining' from source: unknown 13131 1726867203.30497: variable 'ansible_timeout' from source: unknown 13131 1726867203.30501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867203.30705: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867203.30709: variable 'omit' from source: magic vars 13131 1726867203.30711: starting attempt loop 13131 1726867203.30714: running the handler 13131 1726867203.30730: handler run complete 13131 1726867203.30744: attempt loop complete, returning result 13131 1726867203.30756: _execute() done 
13131 1726867203.30759: dumping result to json 13131 1726867203.30765: done dumping result, returning 13131 1726867203.30780: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-5f24-9b7a-0000000005e8] 13131 1726867203.30783: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000005e8 13131 1726867203.30861: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000005e8 13131 1726867203.30864: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 13131 1726867203.30936: no more pending results, returning what we have 13131 1726867203.30940: results queue empty 13131 1726867203.30941: checking for any_errors_fatal 13131 1726867203.30948: done checking for any_errors_fatal 13131 1726867203.30949: checking for max_fail_percentage 13131 1726867203.30950: done checking for max_fail_percentage 13131 1726867203.30951: checking to see if all hosts have failed and the running result is not ok 13131 1726867203.30952: done checking to see if all hosts have failed 13131 1726867203.30952: getting the remaining hosts for this loop 13131 1726867203.30954: done getting the remaining hosts for this loop 13131 1726867203.30957: getting the next task for host managed_node1 13131 1726867203.30967: done getting next task for host managed_node1 13131 1726867203.30970: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 13131 1726867203.30980: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867203.30984: getting variables 13131 1726867203.30985: in VariableManager get_vars() 13131 1726867203.31054: Calling all_inventory to load vars for managed_node1 13131 1726867203.31057: Calling groups_inventory to load vars for managed_node1 13131 1726867203.31059: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867203.31070: Calling all_plugins_play to load vars for managed_node1 13131 1726867203.31072: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867203.31075: Calling groups_plugins_play to load vars for managed_node1 13131 1726867203.32421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867203.34673: done with get_vars() 13131 1726867203.34698: done getting variables 13131 1726867203.34754: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867203.34880: variable 'profile' from source: include params 13131 1726867203.34885: variable 'item' from source: include params 13131 1726867203.34941: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:20:03 -0400 (0:00:00.072) 0:00:18.460 ****** 13131 1726867203.34974: entering _queue_task() for managed_node1/command 13131 1726867203.35292: worker is 1 (out of 1 available) 13131 1726867203.35306: exiting _queue_task() for managed_node1/command 13131 1726867203.35316: done queuing things up, now waiting for results queue to drain 13131 1726867203.35317: waiting for pending results... 13131 1726867203.35602: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0 13131 1726867203.35965: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000005ea 13131 1726867203.35970: variable 'ansible_search_path' from source: unknown 13131 1726867203.35973: variable 'ansible_search_path' from source: unknown 13131 1726867203.35976: calling self._execute() 13131 1726867203.36045: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867203.36051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867203.36060: variable 'omit' from source: magic vars 13131 1726867203.36945: variable 'ansible_distribution_major_version' from source: facts 13131 1726867203.36959: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867203.37176: variable 'profile_stat' from source: set_fact 13131 1726867203.37194: Evaluated conditional (profile_stat.stat.exists): False 13131 1726867203.37198: when evaluation is False, skipping this task 13131 1726867203.37201: _execute() done 13131 1726867203.37203: dumping result to json 13131 1726867203.37206: done dumping result, returning 13131 1726867203.37208: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0 [0affcac9-a3a5-5f24-9b7a-0000000005ea] 13131 1726867203.37214: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000005ea 13131 
1726867203.37423: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000005ea 13131 1726867203.37427: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13131 1726867203.37484: no more pending results, returning what we have 13131 1726867203.37489: results queue empty 13131 1726867203.37490: checking for any_errors_fatal 13131 1726867203.37495: done checking for any_errors_fatal 13131 1726867203.37495: checking for max_fail_percentage 13131 1726867203.37497: done checking for max_fail_percentage 13131 1726867203.37498: checking to see if all hosts have failed and the running result is not ok 13131 1726867203.37498: done checking to see if all hosts have failed 13131 1726867203.37499: getting the remaining hosts for this loop 13131 1726867203.37501: done getting the remaining hosts for this loop 13131 1726867203.37504: getting the next task for host managed_node1 13131 1726867203.37511: done getting next task for host managed_node1 13131 1726867203.37514: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13131 1726867203.37519: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13131 1726867203.37523: getting variables 13131 1726867203.37525: in VariableManager get_vars() 13131 1726867203.37584: Calling all_inventory to load vars for managed_node1 13131 1726867203.37588: Calling groups_inventory to load vars for managed_node1 13131 1726867203.37590: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867203.37603: Calling all_plugins_play to load vars for managed_node1 13131 1726867203.37606: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867203.37609: Calling groups_plugins_play to load vars for managed_node1 13131 1726867203.40535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867203.43740: done with get_vars() 13131 1726867203.43763: done getting variables 13131 1726867203.44013: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867203.44133: variable 'profile' from source: include params 13131 1726867203.44137: variable 'item' from source: include params 13131 1726867203.44196: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:20:03 -0400 (0:00:00.092) 0:00:18.552 ****** 13131 1726867203.44229: entering _queue_task() for managed_node1/set_fact 13131 1726867203.44652: worker is 1 (out of 1 available) 13131 1726867203.44665: exiting _queue_task() for managed_node1/set_fact 13131 1726867203.44674: done queuing things up, now waiting for results queue 
to drain 13131 1726867203.44676: waiting for pending results... 13131 1726867203.44935: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 13131 1726867203.44986: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000005eb 13131 1726867203.44997: variable 'ansible_search_path' from source: unknown 13131 1726867203.45008: variable 'ansible_search_path' from source: unknown 13131 1726867203.45050: calling self._execute() 13131 1726867203.45149: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867203.45284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867203.45287: variable 'omit' from source: magic vars 13131 1726867203.45526: variable 'ansible_distribution_major_version' from source: facts 13131 1726867203.45544: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867203.45664: variable 'profile_stat' from source: set_fact 13131 1726867203.45684: Evaluated conditional (profile_stat.stat.exists): False 13131 1726867203.45691: when evaluation is False, skipping this task 13131 1726867203.45699: _execute() done 13131 1726867203.45706: dumping result to json 13131 1726867203.45713: done dumping result, returning 13131 1726867203.45723: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 [0affcac9-a3a5-5f24-9b7a-0000000005eb] 13131 1726867203.45731: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000005eb 13131 1726867203.45983: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000005eb 13131 1726867203.45986: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13131 1726867203.46118: no more pending results, returning what we have 13131 1726867203.46121: results queue empty 13131 1726867203.46122: checking for any_errors_fatal 13131 1726867203.46127: 
done checking for any_errors_fatal 13131 1726867203.46128: checking for max_fail_percentage 13131 1726867203.46129: done checking for max_fail_percentage 13131 1726867203.46130: checking to see if all hosts have failed and the running result is not ok 13131 1726867203.46131: done checking to see if all hosts have failed 13131 1726867203.46132: getting the remaining hosts for this loop 13131 1726867203.46133: done getting the remaining hosts for this loop 13131 1726867203.46136: getting the next task for host managed_node1 13131 1726867203.46142: done getting next task for host managed_node1 13131 1726867203.46145: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13131 1726867203.46149: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867203.46152: getting variables 13131 1726867203.46154: in VariableManager get_vars() 13131 1726867203.46211: Calling all_inventory to load vars for managed_node1 13131 1726867203.46214: Calling groups_inventory to load vars for managed_node1 13131 1726867203.46217: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867203.46226: Calling all_plugins_play to load vars for managed_node1 13131 1726867203.46229: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867203.46232: Calling groups_plugins_play to load vars for managed_node1 13131 1726867203.47850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867203.49895: done with get_vars() 13131 1726867203.49918: done getting variables 13131 1726867203.49986: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867203.50139: variable 'profile' from source: include params 13131 1726867203.50143: variable 'item' from source: include params 13131 1726867203.50314: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:20:03 -0400 (0:00:00.061) 0:00:18.614 ****** 13131 1726867203.50344: entering _queue_task() for managed_node1/command 13131 1726867203.51072: worker is 1 (out of 1 available) 13131 1726867203.51147: exiting _queue_task() for managed_node1/command 13131 1726867203.51163: done queuing things up, now waiting for results queue to drain 13131 1726867203.51165: waiting for pending results... 
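The repeated "Evaluated conditional (profile_stat.stat.exists): False ... when evaluation is False, skipping this task" entries above follow one pattern: each `when:` clause is evaluated against the host's variables, and the first falsey result short-circuits the task into a skip record. A rough model of that flow (names are illustrative, not ansible-core internals; real Ansible evaluates Jinja2 expressions, not Python callables):

```python
def evaluate_when(conditionals, variables):
    """Return a skip result if any conditional is falsey, else None (run the task)."""
    for source_text, predicate in conditionals:
        if not predicate(variables):
            return {
                "changed": False,
                "false_condition": source_text,
                "skip_reason": "Conditional result was False",
            }
    return None  # all conditionals held; the task would run

# Variables as the log shows them: an ifcfg stat that found no file.
facts = {
    "ansible_distribution_major_version": "10",
    "profile_stat": {"stat": {"exists": False}},
}

result = evaluate_when(
    [
        ("ansible_distribution_major_version != '6'",
         lambda v: v["ansible_distribution_major_version"] != "6"),
        ("profile_stat.stat.exists",
         lambda v: v["profile_stat"]["stat"]["exists"]),
    ],
    facts,
)
print(result["skip_reason"])  # → Conditional result was False
```

The first conditional passes (as in the log's "Evaluated conditional (ansible_distribution_major_version != '6'): True"), so the skip is attributed to the second, matching the `false_condition` field in the skipped results above.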
13131 1726867203.51353: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0 13131 1726867203.51459: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000005ec 13131 1726867203.51481: variable 'ansible_search_path' from source: unknown 13131 1726867203.51491: variable 'ansible_search_path' from source: unknown 13131 1726867203.51531: calling self._execute() 13131 1726867203.51618: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867203.51630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867203.51645: variable 'omit' from source: magic vars 13131 1726867203.51992: variable 'ansible_distribution_major_version' from source: facts 13131 1726867203.52010: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867203.52129: variable 'profile_stat' from source: set_fact 13131 1726867203.52147: Evaluated conditional (profile_stat.stat.exists): False 13131 1726867203.52155: when evaluation is False, skipping this task 13131 1726867203.52162: _execute() done 13131 1726867203.52169: dumping result to json 13131 1726867203.52175: done dumping result, returning 13131 1726867203.52187: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0 [0affcac9-a3a5-5f24-9b7a-0000000005ec] 13131 1726867203.52197: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000005ec skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13131 1726867203.52383: no more pending results, returning what we have 13131 1726867203.52387: results queue empty 13131 1726867203.52388: checking for any_errors_fatal 13131 1726867203.52396: done checking for any_errors_fatal 13131 1726867203.52396: checking for max_fail_percentage 13131 1726867203.52398: done checking for max_fail_percentage 13131 1726867203.52399: checking to see if all hosts have failed 
and the running result is not ok 13131 1726867203.52399: done checking to see if all hosts have failed 13131 1726867203.52400: getting the remaining hosts for this loop 13131 1726867203.52401: done getting the remaining hosts for this loop 13131 1726867203.52404: getting the next task for host managed_node1 13131 1726867203.52412: done getting next task for host managed_node1 13131 1726867203.52414: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13131 1726867203.52419: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867203.52423: getting variables 13131 1726867203.52424: in VariableManager get_vars() 13131 1726867203.52590: Calling all_inventory to load vars for managed_node1 13131 1726867203.52596: Calling groups_inventory to load vars for managed_node1 13131 1726867203.52599: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867203.52609: Calling all_plugins_play to load vars for managed_node1 13131 1726867203.52612: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867203.52616: Calling groups_plugins_play to load vars for managed_node1 13131 1726867203.53135: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000005ec 13131 1726867203.53139: WORKER PROCESS EXITING 13131 1726867203.54036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867203.55389: done with get_vars() 13131 1726867203.55406: done getting variables 13131 1726867203.55447: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867203.55533: variable 'profile' from source: include params 13131 1726867203.55536: variable 'item' from source: include params 13131 1726867203.55573: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:20:03 -0400 (0:00:00.052) 0:00:18.666 ****** 13131 1726867203.55598: entering _queue_task() for managed_node1/set_fact 13131 1726867203.55837: worker is 1 (out of 1 available) 13131 1726867203.55851: exiting _queue_task() for managed_node1/set_fact 13131 
1726867203.55862: done queuing things up, now waiting for results queue to drain 13131 1726867203.55864: waiting for pending results... 13131 1726867203.56047: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0 13131 1726867203.56130: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000005ed 13131 1726867203.56141: variable 'ansible_search_path' from source: unknown 13131 1726867203.56145: variable 'ansible_search_path' from source: unknown 13131 1726867203.56172: calling self._execute() 13131 1726867203.56242: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867203.56247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867203.56269: variable 'omit' from source: magic vars 13131 1726867203.56528: variable 'ansible_distribution_major_version' from source: facts 13131 1726867203.56535: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867203.56618: variable 'profile_stat' from source: set_fact 13131 1726867203.56628: Evaluated conditional (profile_stat.stat.exists): False 13131 1726867203.56633: when evaluation is False, skipping this task 13131 1726867203.56636: _execute() done 13131 1726867203.56638: dumping result to json 13131 1726867203.56641: done dumping result, returning 13131 1726867203.56652: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0 [0affcac9-a3a5-5f24-9b7a-0000000005ed] 13131 1726867203.56655: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000005ed 13131 1726867203.56732: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000005ed 13131 1726867203.56734: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13131 1726867203.56806: no more pending results, returning what we have 13131 1726867203.56810: results queue empty 13131 
1726867203.56811: checking for any_errors_fatal 13131 1726867203.56815: done checking for any_errors_fatal 13131 1726867203.56815: checking for max_fail_percentage 13131 1726867203.56817: done checking for max_fail_percentage 13131 1726867203.56817: checking to see if all hosts have failed and the running result is not ok 13131 1726867203.56818: done checking to see if all hosts have failed 13131 1726867203.56819: getting the remaining hosts for this loop 13131 1726867203.56820: done getting the remaining hosts for this loop 13131 1726867203.56823: getting the next task for host managed_node1 13131 1726867203.56831: done getting next task for host managed_node1 13131 1726867203.56833: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13131 1726867203.56836: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867203.56841: getting variables 13131 1726867203.56842: in VariableManager get_vars() 13131 1726867203.56886: Calling all_inventory to load vars for managed_node1 13131 1726867203.56889: Calling groups_inventory to load vars for managed_node1 13131 1726867203.56891: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867203.56902: Calling all_plugins_play to load vars for managed_node1 13131 1726867203.56904: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867203.56907: Calling groups_plugins_play to load vars for managed_node1 13131 1726867203.58366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867203.60281: done with get_vars() 13131 1726867203.60298: done getting variables 13131 1726867203.60338: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867203.60416: variable 'profile' from source: include params 13131 1726867203.60419: variable 'item' from source: include params 13131 1726867203.60456: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 17:20:03 -0400 (0:00:00.048) 0:00:18.715 ****** 13131 1726867203.60479: entering _queue_task() for managed_node1/assert 13131 1726867203.60695: worker is 1 (out of 1 available) 13131 1726867203.60709: exiting _queue_task() for managed_node1/assert 13131 1726867203.60719: done queuing things up, now waiting for results queue to drain 13131 1726867203.60720: waiting for pending results... 
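The assert task queued here only passes because the earlier set_fact recorded `lsr_net_profile_exists: true` when the nmcli grep returned rc 0. A compact sketch of that flag-then-assert flow (flag names from the log; the `run_assert` helper is a stand-in, not the real assert action plugin):

```python
host_facts = {}

def set_fact(facts, **new):
    """Record per-host facts, as the set_fact action does."""
    facts.update(new)

def run_assert(facts, that):
    """Evaluate a single truthy-flag assertion against host facts."""
    if facts.get(that):
        return {"changed": False, "msg": "All assertions passed"}
    return {"failed": True, "msg": f"Assertion failed: {that}"}

# Flags driven by rc == 0 from "nmcli ... | grep bond0 | grep /etc":
set_fact(host_facts,
         lsr_net_profile_exists=True,
         lsr_net_profile_ansible_managed=True,
         lsr_net_profile_fingerprint=True)

print(run_assert(host_facts, "lsr_net_profile_exists")["msg"])  # → All assertions passed
```

Had the profile been absent, the set_fact task would itself have been skipped (its `when: nm_profile_exists.rc == 0` guard), the flag would be undefined, and the assert would fail rather than pass.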
13131 1726867203.60888: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0' 13131 1726867203.60951: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000356 13131 1726867203.60962: variable 'ansible_search_path' from source: unknown 13131 1726867203.60966: variable 'ansible_search_path' from source: unknown 13131 1726867203.60998: calling self._execute() 13131 1726867203.61069: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867203.61073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867203.61080: variable 'omit' from source: magic vars 13131 1726867203.61337: variable 'ansible_distribution_major_version' from source: facts 13131 1726867203.61346: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867203.61352: variable 'omit' from source: magic vars 13131 1726867203.61387: variable 'omit' from source: magic vars 13131 1726867203.61450: variable 'profile' from source: include params 13131 1726867203.61454: variable 'item' from source: include params 13131 1726867203.61522: variable 'item' from source: include params 13131 1726867203.61528: variable 'omit' from source: magic vars 13131 1726867203.61598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867203.61791: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867203.61797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867203.61800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867203.61802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867203.61805: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 13131 1726867203.61807: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867203.61808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867203.61848: Set connection var ansible_connection to ssh 13131 1726867203.61861: Set connection var ansible_timeout to 10 13131 1726867203.61868: Set connection var ansible_shell_type to sh 13131 1726867203.61882: Set connection var ansible_shell_executable to /bin/sh 13131 1726867203.61910: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867203.61920: Set connection var ansible_pipelining to False 13131 1726867203.61943: variable 'ansible_shell_executable' from source: unknown 13131 1726867203.61951: variable 'ansible_connection' from source: unknown 13131 1726867203.61957: variable 'ansible_module_compression' from source: unknown 13131 1726867203.61964: variable 'ansible_shell_type' from source: unknown 13131 1726867203.61970: variable 'ansible_shell_executable' from source: unknown 13131 1726867203.61976: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867203.61987: variable 'ansible_pipelining' from source: unknown 13131 1726867203.62004: variable 'ansible_timeout' from source: unknown 13131 1726867203.62014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867203.62156: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867203.62172: variable 'omit' from source: magic vars 13131 1726867203.62185: starting attempt loop 13131 1726867203.62195: running the handler 13131 1726867203.62315: variable 'lsr_net_profile_exists' from source: set_fact 13131 1726867203.62336: Evaluated conditional 
(lsr_net_profile_exists): True 13131 1726867203.62346: handler run complete 13131 1726867203.62362: attempt loop complete, returning result 13131 1726867203.62368: _execute() done 13131 1726867203.62442: dumping result to json 13131 1726867203.62446: done dumping result, returning 13131 1726867203.62448: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0' [0affcac9-a3a5-5f24-9b7a-000000000356] 13131 1726867203.62450: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000356 13131 1726867203.62521: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000356 13131 1726867203.62524: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 13131 1726867203.62594: no more pending results, returning what we have 13131 1726867203.62597: results queue empty 13131 1726867203.62598: checking for any_errors_fatal 13131 1726867203.62603: done checking for any_errors_fatal 13131 1726867203.62604: checking for max_fail_percentage 13131 1726867203.62605: done checking for max_fail_percentage 13131 1726867203.62606: checking to see if all hosts have failed and the running result is not ok 13131 1726867203.62606: done checking to see if all hosts have failed 13131 1726867203.62607: getting the remaining hosts for this loop 13131 1726867203.62608: done getting the remaining hosts for this loop 13131 1726867203.62611: getting the next task for host managed_node1 13131 1726867203.62616: done getting next task for host managed_node1 13131 1726867203.62618: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13131 1726867203.62621: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867203.62625: getting variables 13131 1726867203.62626: in VariableManager get_vars() 13131 1726867203.62702: Calling all_inventory to load vars for managed_node1 13131 1726867203.62705: Calling groups_inventory to load vars for managed_node1 13131 1726867203.62707: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867203.62715: Calling all_plugins_play to load vars for managed_node1 13131 1726867203.62717: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867203.62720: Calling groups_plugins_play to load vars for managed_node1 13131 1726867203.63559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867203.64415: done with get_vars() 13131 1726867203.64429: done getting variables 13131 1726867203.64466: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867203.64540: variable 'profile' from source: include params 13131 1726867203.64543: variable 'item' from source: include params 13131 1726867203.64580: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 17:20:03 -0400 
(0:00:00.041) 0:00:18.756 ******
13131 1726867203.64606: entering _queue_task() for managed_node1/assert
13131 1726867203.64798: worker is 1 (out of 1 available)
13131 1726867203.64811: exiting _queue_task() for managed_node1/assert
13131 1726867203.64822: done queuing things up, now waiting for results queue to drain
13131 1726867203.64823: waiting for pending results...
13131 1726867203.64983: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0'
13131 1726867203.65037: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000357
13131 1726867203.65051: variable 'ansible_search_path' from source: unknown
13131 1726867203.65054: variable 'ansible_search_path' from source: unknown
13131 1726867203.65082: calling self._execute()
13131 1726867203.65147: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867203.65156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867203.65164: variable 'omit' from source: magic vars
13131 1726867203.65409: variable 'ansible_distribution_major_version' from source: facts
13131 1726867203.65418: Evaluated conditional (ansible_distribution_major_version != '6'): True
13131 1726867203.65424: variable 'omit' from source: magic vars
13131 1726867203.65453: variable 'omit' from source: magic vars
13131 1726867203.65523: variable 'profile' from source: include params
13131 1726867203.65527: variable 'item' from source: include params
13131 1726867203.65570: variable 'item' from source: include params
13131 1726867203.65585: variable 'omit' from source: magic vars
13131 1726867203.65618: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13131 1726867203.65644: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13131 1726867203.65658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13131 1726867203.65671: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13131 1726867203.65683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13131 1726867203.65709: variable 'inventory_hostname' from source: host vars for 'managed_node1'
13131 1726867203.65712: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867203.65714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867203.65775: Set connection var ansible_connection to ssh
13131 1726867203.65782: Set connection var ansible_timeout to 10
13131 1726867203.65785: Set connection var ansible_shell_type to sh
13131 1726867203.65795: Set connection var ansible_shell_executable to /bin/sh
13131 1726867203.65801: Set connection var ansible_module_compression to ZIP_DEFLATED
13131 1726867203.65807: Set connection var ansible_pipelining to False
13131 1726867203.65825: variable 'ansible_shell_executable' from source: unknown
13131 1726867203.65828: variable 'ansible_connection' from source: unknown
13131 1726867203.65831: variable 'ansible_module_compression' from source: unknown
13131 1726867203.65833: variable 'ansible_shell_type' from source: unknown
13131 1726867203.65835: variable 'ansible_shell_executable' from source: unknown
13131 1726867203.65838: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867203.65840: variable 'ansible_pipelining' from source: unknown
13131 1726867203.65842: variable 'ansible_timeout' from source: unknown
13131 1726867203.65846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867203.65942: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
13131 1726867203.65950: variable 'omit' from source: magic vars
13131 1726867203.65955: starting attempt loop
13131 1726867203.65959: running the handler
13131 1726867203.66033: variable 'lsr_net_profile_ansible_managed' from source: set_fact
13131 1726867203.66037: Evaluated conditional (lsr_net_profile_ansible_managed): True
13131 1726867203.66040: handler run complete
13131 1726867203.66051: attempt loop complete, returning result
13131 1726867203.66054: _execute() done
13131 1726867203.66057: dumping result to json
13131 1726867203.66060: done dumping result, returning
13131 1726867203.66065: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0' [0affcac9-a3a5-5f24-9b7a-000000000357]
13131 1726867203.66069: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000357
13131 1726867203.66146: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000357
13131 1726867203.66150: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false
}

MSG:

All assertions passed
13131 1726867203.66201: no more pending results, returning what we have
13131 1726867203.66204: results queue empty
13131 1726867203.66205: checking for any_errors_fatal
13131 1726867203.66208: done checking for any_errors_fatal
13131 1726867203.66209: checking for max_fail_percentage
13131 1726867203.66211: done checking for max_fail_percentage
13131 1726867203.66212: checking to see if all hosts have failed and the running result is not ok
13131 1726867203.66212: done checking to see if all hosts have failed
13131 1726867203.66213: getting the remaining hosts for this loop
13131 1726867203.66214: done getting the remaining hosts for this loop
13131 1726867203.66217: getting the next task for host managed_node1
13131 1726867203.66222: done getting next task for host managed_node1
13131 1726867203.66224: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }}
13131 1726867203.66227: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867203.66230: getting variables
13131 1726867203.66231: in VariableManager get_vars()
13131 1726867203.66280: Calling all_inventory to load vars for managed_node1
13131 1726867203.66283: Calling groups_inventory to load vars for managed_node1
13131 1726867203.66285: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867203.66296: Calling all_plugins_play to load vars for managed_node1
13131 1726867203.66298: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867203.66301: Calling groups_plugins_play to load vars for managed_node1
13131 1726867203.67028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867203.68167: done with get_vars()
13131 1726867203.68195: done getting variables
13131 1726867203.68250: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
13131 1726867203.68361: variable 'profile' from source: include params
13131 1726867203.68365: variable 'item' from source: include params
13131 1726867203.68434: variable 'item' from source: include params

TASK [Assert that the fingerprint comment is present in bond0] *****************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15
Friday 20 September 2024 17:20:03 -0400 (0:00:00.038) 0:00:18.795 ******
13131 1726867203.68470: entering _queue_task() for managed_node1/assert
13131 1726867203.68762: worker is 1 (out of 1 available)
13131 1726867203.68773: exiting _queue_task() for managed_node1/assert
13131 1726867203.68787: done queuing things up, now waiting for results queue to drain
13131 1726867203.68789: waiting for pending results...
13131 1726867203.69198: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0
13131 1726867203.69204: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000358
13131 1726867203.69210: variable 'ansible_search_path' from source: unknown
13131 1726867203.69214: variable 'ansible_search_path' from source: unknown
13131 1726867203.69243: calling self._execute()
13131 1726867203.69343: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867203.69355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867203.69366: variable 'omit' from source: magic vars
13131 1726867203.69731: variable 'ansible_distribution_major_version' from source: facts
13131 1726867203.69918: Evaluated conditional (ansible_distribution_major_version != '6'): True
13131 1726867203.69933: variable 'omit' from source: magic vars
13131 1726867203.69983: variable 'omit' from source: magic vars
13131 1726867203.70098: variable 'profile' from source: include params
13131 1726867203.70111: variable 'item' from source: include params
13131 1726867203.70181: variable 'item' from source: include params
13131 1726867203.70211: variable 'omit' from source: magic vars
13131 1726867203.70258: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13131 1726867203.70306: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13131 1726867203.70482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13131 1726867203.70486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13131 1726867203.70488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13131 1726867203.70491: variable 'inventory_hostname' from source: host vars for 'managed_node1'
13131 1726867203.70497: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867203.70499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867203.70528: Set connection var ansible_connection to ssh
13131 1726867203.70543: Set connection var ansible_timeout to 10
13131 1726867203.70553: Set connection var ansible_shell_type to sh
13131 1726867203.70567: Set connection var ansible_shell_executable to /bin/sh
13131 1726867203.70584: Set connection var ansible_module_compression to ZIP_DEFLATED
13131 1726867203.70600: Set connection var ansible_pipelining to False
13131 1726867203.70627: variable 'ansible_shell_executable' from source: unknown
13131 1726867203.70637: variable 'ansible_connection' from source: unknown
13131 1726867203.70645: variable 'ansible_module_compression' from source: unknown
13131 1726867203.70653: variable 'ansible_shell_type' from source: unknown
13131 1726867203.70661: variable 'ansible_shell_executable' from source: unknown
13131 1726867203.70668: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867203.70678: variable 'ansible_pipelining' from source: unknown
13131 1726867203.70688: variable 'ansible_timeout' from source: unknown
13131 1726867203.70701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867203.70947: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
13131 1726867203.70950: variable 'omit' from source: magic vars
13131 1726867203.70953: starting attempt loop
13131 1726867203.70955: running the handler
13131 1726867203.71139: variable 'lsr_net_profile_fingerprint' from source: set_fact
13131 1726867203.71150: Evaluated conditional (lsr_net_profile_fingerprint): True
13131 1726867203.71161: handler run complete
13131 1726867203.71181: attempt loop complete, returning result
13131 1726867203.71189: _execute() done
13131 1726867203.71200: dumping result to json
13131 1726867203.71208: done dumping result, returning
13131 1726867203.71219: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0 [0affcac9-a3a5-5f24-9b7a-000000000358]
13131 1726867203.71228: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000358
ok: [managed_node1] => {
    "changed": false
}

MSG:

All assertions passed
13131 1726867203.71386: no more pending results, returning what we have
13131 1726867203.71389: results queue empty
13131 1726867203.71390: checking for any_errors_fatal
13131 1726867203.71394: done checking for any_errors_fatal
13131 1726867203.71394: checking for max_fail_percentage
13131 1726867203.71396: done checking for max_fail_percentage
13131 1726867203.71397: checking to see if all hosts have failed and the running result is not ok
13131 1726867203.71397: done checking to see if all hosts have failed
13131 1726867203.71398: getting the remaining hosts for this loop
13131 1726867203.71399: done getting the remaining hosts for this loop
13131 1726867203.71402: getting the next task for host managed_node1
13131 1726867203.71410: done getting next task for host managed_node1
13131 1726867203.71413: ^ task is: TASK: Include the task 'get_profile_stat.yml'
13131 1726867203.71417: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867203.71421: getting variables
13131 1726867203.71422: in VariableManager get_vars()
13131 1726867203.71484: Calling all_inventory to load vars for managed_node1
13131 1726867203.71487: Calling groups_inventory to load vars for managed_node1
13131 1726867203.71490: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867203.71499: Calling all_plugins_play to load vars for managed_node1
13131 1726867203.71502: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867203.71505: Calling groups_plugins_play to load vars for managed_node1
13131 1726867203.72090: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000358
13131 1726867203.72096: WORKER PROCESS EXITING
13131 1726867203.72811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867203.73665: done with get_vars()
13131 1726867203.73682: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Friday 20 September 2024 17:20:03 -0400 (0:00:00.052) 0:00:18.848 ******
13131 1726867203.73746: entering _queue_task() for managed_node1/include_tasks
13131 1726867203.73968: worker is 1 (out of 1 available)
13131 1726867203.73982: exiting _queue_task() for managed_node1/include_tasks
13131 1726867203.73995: done queuing things up, now waiting for results queue to drain
13131 1726867203.73996: waiting for pending results...
13131 1726867203.74159: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml'
13131 1726867203.74239: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000035c
13131 1726867203.74251: variable 'ansible_search_path' from source: unknown
13131 1726867203.74254: variable 'ansible_search_path' from source: unknown
13131 1726867203.74284: calling self._execute()
13131 1726867203.74354: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867203.74358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867203.74367: variable 'omit' from source: magic vars
13131 1726867203.74630: variable 'ansible_distribution_major_version' from source: facts
13131 1726867203.74640: Evaluated conditional (ansible_distribution_major_version != '6'): True
13131 1726867203.74646: _execute() done
13131 1726867203.74648: dumping result to json
13131 1726867203.74651: done dumping result, returning
13131 1726867203.74663: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-5f24-9b7a-00000000035c]
13131 1726867203.74665: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000035c
13131 1726867203.74746: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000035c
13131 1726867203.74748: WORKER PROCESS EXITING
13131 1726867203.74789: no more pending results, returning what we have
13131 1726867203.74794: in VariableManager get_vars()
13131 1726867203.74847: Calling all_inventory to load vars for managed_node1
13131 1726867203.74850: Calling groups_inventory to load vars for managed_node1
13131 1726867203.74852: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867203.74862: Calling all_plugins_play to load vars for managed_node1
13131 1726867203.74865: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867203.74867: Calling groups_plugins_play to load vars for managed_node1
13131 1726867203.75732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867203.76566: done with get_vars()
13131 1726867203.76583: variable 'ansible_search_path' from source: unknown
13131 1726867203.76584: variable 'ansible_search_path' from source: unknown
13131 1726867203.76608: we have included files to process
13131 1726867203.76608: generating all_blocks data
13131 1726867203.76610: done generating all_blocks data
13131 1726867203.76613: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
13131 1726867203.76613: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
13131 1726867203.76615: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
13131 1726867203.77180: done processing included file
13131 1726867203.77182: iterating over new_blocks loaded from include file
13131 1726867203.77183: in VariableManager get_vars()
13131 1726867203.77200: done with get_vars()
13131 1726867203.77201: filtering new block on tags
13131 1726867203.77215: done filtering new block on tags
13131 1726867203.77216: in VariableManager get_vars()
13131 1726867203.77233: done with get_vars()
13131 1726867203.77234: filtering new block on tags
13131 1726867203.77247: done filtering new block on tags
13131 1726867203.77248: done iterating over new_blocks loaded from include file
included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1
13131 1726867203.77252: extending task lists for all hosts with included blocks
13131 1726867203.77349: done extending task lists
13131 1726867203.77350: done processing included files
13131 1726867203.77350: results queue empty
13131 1726867203.77351: checking for any_errors_fatal
13131 1726867203.77352: done checking for any_errors_fatal
13131 1726867203.77353: checking for max_fail_percentage
13131 1726867203.77354: done checking for max_fail_percentage
13131 1726867203.77354: checking to see if all hosts have failed and the running result is not ok
13131 1726867203.77355: done checking to see if all hosts have failed
13131 1726867203.77355: getting the remaining hosts for this loop
13131 1726867203.77356: done getting the remaining hosts for this loop
13131 1726867203.77357: getting the next task for host managed_node1
13131 1726867203.77360: done getting next task for host managed_node1
13131 1726867203.77361: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag
13131 1726867203.77363: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867203.77365: getting variables
13131 1726867203.77365: in VariableManager get_vars()
13131 1726867203.77379: Calling all_inventory to load vars for managed_node1
13131 1726867203.77382: Calling groups_inventory to load vars for managed_node1
13131 1726867203.77384: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867203.77388: Calling all_plugins_play to load vars for managed_node1
13131 1726867203.77390: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867203.77391: Calling groups_plugins_play to load vars for managed_node1
13131 1726867203.78035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867203.78866: done with get_vars()
13131 1726867203.78881: done getting variables
13131 1726867203.78908: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Friday 20 September 2024 17:20:03 -0400 (0:00:00.051) 0:00:18.899 ******
13131 1726867203.78927: entering _queue_task() for managed_node1/set_fact
13131 1726867203.79139: worker is 1 (out of 1 available)
13131 1726867203.79153: exiting _queue_task() for managed_node1/set_fact
13131 1726867203.79165: done queuing things up, now waiting for results queue to drain
13131 1726867203.79166: waiting for pending results...
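Editorial note: the task queued here lives at get_profile_stat.yml:3 inside the included file. That file's contents are not shown in this log, but from the three ansible_facts the task result reports a little further down (all initialized to false), it presumably looks roughly like the following sketch — the exact task body is an assumption:

```yaml
# Hypothetical reconstruction of get_profile_stat.yml:3, inferred only from
# the ansible_facts shown in the task result in this log; the real file may
# differ in layout and module spelling (set_fact vs ansible.builtin.set_fact).
- name: Initialize NM profile exist and ansible_managed comment flag
  ansible.builtin.set_fact:
    lsr_net_profile_exists: false          # flipped to true by later stat/comment checks
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```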
13131 1726867203.79333: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag
13131 1726867203.79404: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000062c
13131 1726867203.79416: variable 'ansible_search_path' from source: unknown
13131 1726867203.79419: variable 'ansible_search_path' from source: unknown
13131 1726867203.79449: calling self._execute()
13131 1726867203.79520: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867203.79526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867203.79533: variable 'omit' from source: magic vars
13131 1726867203.79808: variable 'ansible_distribution_major_version' from source: facts
13131 1726867203.79817: Evaluated conditional (ansible_distribution_major_version != '6'): True
13131 1726867203.79825: variable 'omit' from source: magic vars
13131 1726867203.79855: variable 'omit' from source: magic vars
13131 1726867203.79881: variable 'omit' from source: magic vars
13131 1726867203.79912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13131 1726867203.79943: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13131 1726867203.79955: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13131 1726867203.79968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13131 1726867203.79979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13131 1726867203.80002: variable 'inventory_hostname' from source: host vars for 'managed_node1'
13131 1726867203.80006: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867203.80008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867203.80076: Set connection var ansible_connection to ssh
13131 1726867203.80085: Set connection var ansible_timeout to 10
13131 1726867203.80088: Set connection var ansible_shell_type to sh
13131 1726867203.80096: Set connection var ansible_shell_executable to /bin/sh
13131 1726867203.80103: Set connection var ansible_module_compression to ZIP_DEFLATED
13131 1726867203.80108: Set connection var ansible_pipelining to False
13131 1726867203.80124: variable 'ansible_shell_executable' from source: unknown
13131 1726867203.80127: variable 'ansible_connection' from source: unknown
13131 1726867203.80130: variable 'ansible_module_compression' from source: unknown
13131 1726867203.80132: variable 'ansible_shell_type' from source: unknown
13131 1726867203.80134: variable 'ansible_shell_executable' from source: unknown
13131 1726867203.80136: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867203.80139: variable 'ansible_pipelining' from source: unknown
13131 1726867203.80142: variable 'ansible_timeout' from source: unknown
13131 1726867203.80146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867203.80242: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
13131 1726867203.80250: variable 'omit' from source: magic vars
13131 1726867203.80255: starting attempt loop
13131 1726867203.80258: running the handler
13131 1726867203.80271: handler run complete
13131 1726867203.80280: attempt loop complete, returning result
13131 1726867203.80283: _execute() done
13131 1726867203.80285: dumping result to json
13131 1726867203.80288: done dumping result, returning
13131 1726867203.80297: done running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-5f24-9b7a-00000000062c]
13131 1726867203.80299: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000062c
13131 1726867203.80368: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000062c
13131 1726867203.80373: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
13131 1726867203.80426: no more pending results, returning what we have
13131 1726867203.80430: results queue empty
13131 1726867203.80431: checking for any_errors_fatal
13131 1726867203.80432: done checking for any_errors_fatal
13131 1726867203.80433: checking for max_fail_percentage
13131 1726867203.80434: done checking for max_fail_percentage
13131 1726867203.80435: checking to see if all hosts have failed and the running result is not ok
13131 1726867203.80435: done checking to see if all hosts have failed
13131 1726867203.80436: getting the remaining hosts for this loop
13131 1726867203.80438: done getting the remaining hosts for this loop
13131 1726867203.80441: getting the next task for host managed_node1
13131 1726867203.80447: done getting next task for host managed_node1
13131 1726867203.80449: ^ task is: TASK: Stat profile file
13131 1726867203.80452: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867203.80455: getting variables
13131 1726867203.80456: in VariableManager get_vars()
13131 1726867203.80500: Calling all_inventory to load vars for managed_node1
13131 1726867203.80502: Calling groups_inventory to load vars for managed_node1
13131 1726867203.80505: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867203.80512: Calling all_plugins_play to load vars for managed_node1
13131 1726867203.80514: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867203.80517: Calling groups_plugins_play to load vars for managed_node1
13131 1726867203.84134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867203.84973: done with get_vars()
13131 1726867203.84989: done getting variables

TASK [Stat profile file] *******************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Friday 20 September 2024 17:20:03 -0400 (0:00:00.061) 0:00:18.961 ******
13131 1726867203.85044: entering _queue_task() for managed_node1/stat
13131 1726867203.85283: worker is 1 (out of 1 available)
13131 1726867203.85298: exiting _queue_task() for managed_node1/stat
13131 1726867203.85309: done queuing things up, now waiting for results queue to drain
13131 1726867203.85310: waiting for pending results...
13131 1726867203.85473: running TaskExecutor() for managed_node1/TASK: Stat profile file 13131 1726867203.85544: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000062d 13131 1726867203.85557: variable 'ansible_search_path' from source: unknown 13131 1726867203.85560: variable 'ansible_search_path' from source: unknown 13131 1726867203.85590: calling self._execute() 13131 1726867203.85657: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867203.85664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867203.85673: variable 'omit' from source: magic vars 13131 1726867203.85943: variable 'ansible_distribution_major_version' from source: facts 13131 1726867203.85953: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867203.85960: variable 'omit' from source: magic vars 13131 1726867203.85998: variable 'omit' from source: magic vars 13131 1726867203.86064: variable 'profile' from source: include params 13131 1726867203.86068: variable 'item' from source: include params 13131 1726867203.86119: variable 'item' from source: include params 13131 1726867203.86134: variable 'omit' from source: magic vars 13131 1726867203.86163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867203.86195: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867203.86209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867203.86224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867203.86233: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867203.86254: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 
1726867203.86257: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867203.86260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867203.86328: Set connection var ansible_connection to ssh 13131 1726867203.86336: Set connection var ansible_timeout to 10 13131 1726867203.86338: Set connection var ansible_shell_type to sh 13131 1726867203.86344: Set connection var ansible_shell_executable to /bin/sh 13131 1726867203.86352: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867203.86356: Set connection var ansible_pipelining to False 13131 1726867203.86373: variable 'ansible_shell_executable' from source: unknown 13131 1726867203.86375: variable 'ansible_connection' from source: unknown 13131 1726867203.86380: variable 'ansible_module_compression' from source: unknown 13131 1726867203.86382: variable 'ansible_shell_type' from source: unknown 13131 1726867203.86384: variable 'ansible_shell_executable' from source: unknown 13131 1726867203.86387: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867203.86389: variable 'ansible_pipelining' from source: unknown 13131 1726867203.86396: variable 'ansible_timeout' from source: unknown 13131 1726867203.86399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867203.86537: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867203.86545: variable 'omit' from source: magic vars 13131 1726867203.86552: starting attempt loop 13131 1726867203.86556: running the handler 13131 1726867203.86566: _low_level_execute_command(): starting 13131 1726867203.86573: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867203.87067: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867203.87104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867203.87108: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867203.87111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867203.87151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867203.87159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867203.87171: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867203.87234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867203.88912: stdout chunk (state=3): >>>/root <<< 13131 1726867203.89011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867203.89038: stderr chunk (state=3): >>><<< 13131 1726867203.89041: stdout chunk (state=3): >>><<< 13131 1726867203.89062: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867203.89073: _low_level_execute_command(): starting 13131 1726867203.89079: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867203.8906121-14176-21204818046882 `" && echo ansible-tmp-1726867203.8906121-14176-21204818046882="` echo /root/.ansible/tmp/ansible-tmp-1726867203.8906121-14176-21204818046882 `" ) && sleep 0' 13131 1726867203.89476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867203.89512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867203.89515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867203.89524: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867203.89527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867203.89528: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867203.89573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867203.89580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867203.89623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867203.91507: stdout chunk (state=3): >>>ansible-tmp-1726867203.8906121-14176-21204818046882=/root/.ansible/tmp/ansible-tmp-1726867203.8906121-14176-21204818046882 <<< 13131 1726867203.91621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867203.91640: stderr chunk (state=3): >>><<< 13131 1726867203.91644: stdout chunk (state=3): >>><<< 13131 1726867203.91657: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867203.8906121-14176-21204818046882=/root/.ansible/tmp/ansible-tmp-1726867203.8906121-14176-21204818046882 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867203.91691: variable 'ansible_module_compression' from source: unknown 13131 1726867203.91738: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13131 1726867203.91768: variable 'ansible_facts' from source: unknown 13131 1726867203.91824: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867203.8906121-14176-21204818046882/AnsiballZ_stat.py 13131 1726867203.91915: Sending initial data 13131 1726867203.91919: Sent initial data (152 bytes) 13131 1726867203.92347: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867203.92350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867203.92353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass <<< 13131 1726867203.92355: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867203.92357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867203.92409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867203.92415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867203.92460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867203.93998: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 13131 1726867203.94001: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867203.94043: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867203.94088: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmplkjnxqg7 /root/.ansible/tmp/ansible-tmp-1726867203.8906121-14176-21204818046882/AnsiballZ_stat.py <<< 13131 1726867203.94095: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867203.8906121-14176-21204818046882/AnsiballZ_stat.py" <<< 13131 1726867203.94132: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmplkjnxqg7" to remote "/root/.ansible/tmp/ansible-tmp-1726867203.8906121-14176-21204818046882/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867203.8906121-14176-21204818046882/AnsiballZ_stat.py" <<< 13131 1726867203.94696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867203.94707: stderr chunk (state=3): >>><<< 13131 1726867203.94711: stdout chunk (state=3): >>><<< 13131 1726867203.94727: done transferring module to remote 13131 1726867203.94735: _low_level_execute_command(): starting 13131 1726867203.94739: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867203.8906121-14176-21204818046882/ /root/.ansible/tmp/ansible-tmp-1726867203.8906121-14176-21204818046882/AnsiballZ_stat.py && sleep 0' 13131 1726867203.95150: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867203.95154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867203.95156: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867203.95158: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867203.95163: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867203.95213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867203.95217: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867203.95221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867203.95264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867203.97134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867203.97154: stderr chunk (state=3): >>><<< 13131 1726867203.97157: stdout chunk (state=3): >>><<< 13131 1726867203.97168: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867203.97171: _low_level_execute_command(): starting 13131 1726867203.97175: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867203.8906121-14176-21204818046882/AnsiballZ_stat.py && sleep 0' 13131 1726867203.97551: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867203.97582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867203.97585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867203.97588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867203.97590: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867203.97592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 
1726867203.97642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867203.97648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867203.97703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867204.12857: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13131 1726867204.14109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867204.14135: stderr chunk (state=3): >>><<< 13131 1726867204.14139: stdout chunk (state=3): >>><<< 13131 1726867204.14155: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867204.14180: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867203.8906121-14176-21204818046882/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867204.14190: _low_level_execute_command(): starting 13131 1726867204.14193: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867203.8906121-14176-21204818046882/ > /dev/null 2>&1 && sleep 0' 13131 1726867204.14643: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867204.14647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found 
<<< 13131 1726867204.14649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867204.14651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867204.14658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867204.14695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867204.14699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867204.14765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867204.16566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867204.16608: stderr chunk (state=3): >>><<< 13131 1726867204.16612: stdout chunk (state=3): >>><<< 13131 1726867204.16623: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867204.16629: handler run complete 13131 1726867204.16643: attempt loop complete, returning result 13131 1726867204.16645: _execute() done 13131 1726867204.16648: dumping result to json 13131 1726867204.16652: done dumping result, returning 13131 1726867204.16659: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0affcac9-a3a5-5f24-9b7a-00000000062d] 13131 1726867204.16662: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000062d 13131 1726867204.16749: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000062d 13131 1726867204.16752: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 13131 1726867204.16812: no more pending results, returning what we have 13131 1726867204.16815: results queue empty 13131 1726867204.16816: checking for any_errors_fatal 13131 1726867204.16823: done checking for any_errors_fatal 13131 1726867204.16824: checking for max_fail_percentage 13131 1726867204.16825: done checking for max_fail_percentage 13131 1726867204.16826: checking to see if all hosts have failed and the running result is not ok 13131 1726867204.16827: done checking to see if all hosts have failed 13131 1726867204.16827: getting the remaining hosts for this loop 13131 1726867204.16828: done getting the remaining hosts for this loop 13131 1726867204.16832: getting the next task for host managed_node1 
13131 1726867204.16838: done getting next task for host managed_node1 13131 1726867204.16840: ^ task is: TASK: Set NM profile exist flag based on the profile files 13131 1726867204.16844: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867204.16848: getting variables 13131 1726867204.16849: in VariableManager get_vars() 13131 1726867204.16905: Calling all_inventory to load vars for managed_node1 13131 1726867204.16908: Calling groups_inventory to load vars for managed_node1 13131 1726867204.16911: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867204.16922: Calling all_plugins_play to load vars for managed_node1 13131 1726867204.16924: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867204.16927: Calling groups_plugins_play to load vars for managed_node1 13131 1726867204.17726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867204.18598: done with get_vars() 13131 1726867204.18615: done getting variables 13131 1726867204.18656: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:20:04 -0400 (0:00:00.336) 0:00:19.297 ****** 13131 1726867204.18680: entering _queue_task() for managed_node1/set_fact 13131 1726867204.18898: worker is 1 (out of 1 available) 13131 1726867204.18911: exiting _queue_task() for managed_node1/set_fact 13131 1726867204.18922: done queuing things up, now waiting for results queue to drain 13131 1726867204.18924: waiting for pending results... 
13131 1726867204.19098: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 13131 1726867204.19176: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000062e 13131 1726867204.19188: variable 'ansible_search_path' from source: unknown 13131 1726867204.19192: variable 'ansible_search_path' from source: unknown 13131 1726867204.19223: calling self._execute() 13131 1726867204.19294: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.19302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.19311: variable 'omit' from source: magic vars 13131 1726867204.19592: variable 'ansible_distribution_major_version' from source: facts 13131 1726867204.19606: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867204.19689: variable 'profile_stat' from source: set_fact 13131 1726867204.19703: Evaluated conditional (profile_stat.stat.exists): False 13131 1726867204.19706: when evaluation is False, skipping this task 13131 1726867204.19709: _execute() done 13131 1726867204.19711: dumping result to json 13131 1726867204.19714: done dumping result, returning 13131 1726867204.19720: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-5f24-9b7a-00000000062e] 13131 1726867204.19724: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000062e 13131 1726867204.19807: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000062e 13131 1726867204.19810: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13131 1726867204.19854: no more pending results, returning what we have 13131 1726867204.19858: results queue empty 13131 1726867204.19859: checking for any_errors_fatal 13131 1726867204.19866: done checking for any_errors_fatal 13131 1726867204.19867: 
checking for max_fail_percentage 13131 1726867204.19868: done checking for max_fail_percentage 13131 1726867204.19869: checking to see if all hosts have failed and the running result is not ok 13131 1726867204.19870: done checking to see if all hosts have failed 13131 1726867204.19870: getting the remaining hosts for this loop 13131 1726867204.19872: done getting the remaining hosts for this loop 13131 1726867204.19875: getting the next task for host managed_node1 13131 1726867204.19883: done getting next task for host managed_node1 13131 1726867204.19886: ^ task is: TASK: Get NM profile info 13131 1726867204.19890: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867204.19893: getting variables 13131 1726867204.19895: in VariableManager get_vars() 13131 1726867204.19939: Calling all_inventory to load vars for managed_node1 13131 1726867204.19941: Calling groups_inventory to load vars for managed_node1 13131 1726867204.19943: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867204.19952: Calling all_plugins_play to load vars for managed_node1 13131 1726867204.19954: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867204.19957: Calling groups_plugins_play to load vars for managed_node1 13131 1726867204.20794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867204.21667: done with get_vars() 13131 1726867204.21682: done getting variables 13131 1726867204.21725: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:20:04 -0400 (0:00:00.030) 0:00:19.328 ****** 13131 1726867204.21745: entering _queue_task() for managed_node1/shell 13131 1726867204.22010: worker is 1 (out of 1 available) 13131 1726867204.22022: exiting _queue_task() for managed_node1/shell 13131 1726867204.22033: done queuing things up, now waiting for results queue to drain 13131 1726867204.22035: waiting for pending results... 
13131 1726867204.22402: running TaskExecutor() for managed_node1/TASK: Get NM profile info 13131 1726867204.22437: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000062f 13131 1726867204.22456: variable 'ansible_search_path' from source: unknown 13131 1726867204.22463: variable 'ansible_search_path' from source: unknown 13131 1726867204.22507: calling self._execute() 13131 1726867204.22604: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.22683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.22686: variable 'omit' from source: magic vars 13131 1726867204.22961: variable 'ansible_distribution_major_version' from source: facts 13131 1726867204.22978: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867204.22982: variable 'omit' from source: magic vars 13131 1726867204.23023: variable 'omit' from source: magic vars 13131 1726867204.23099: variable 'profile' from source: include params 13131 1726867204.23102: variable 'item' from source: include params 13131 1726867204.23150: variable 'item' from source: include params 13131 1726867204.23165: variable 'omit' from source: magic vars 13131 1726867204.23199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867204.23226: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867204.23241: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867204.23256: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867204.23267: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867204.23295: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 
1726867204.23298: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.23301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.23364: Set connection var ansible_connection to ssh 13131 1726867204.23369: Set connection var ansible_timeout to 10 13131 1726867204.23372: Set connection var ansible_shell_type to sh 13131 1726867204.23381: Set connection var ansible_shell_executable to /bin/sh 13131 1726867204.23399: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867204.23402: Set connection var ansible_pipelining to False 13131 1726867204.23427: variable 'ansible_shell_executable' from source: unknown 13131 1726867204.23430: variable 'ansible_connection' from source: unknown 13131 1726867204.23433: variable 'ansible_module_compression' from source: unknown 13131 1726867204.23436: variable 'ansible_shell_type' from source: unknown 13131 1726867204.23438: variable 'ansible_shell_executable' from source: unknown 13131 1726867204.23441: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.23443: variable 'ansible_pipelining' from source: unknown 13131 1726867204.23445: variable 'ansible_timeout' from source: unknown 13131 1726867204.23448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.23547: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867204.23556: variable 'omit' from source: magic vars 13131 1726867204.23562: starting attempt loop 13131 1726867204.23565: running the handler 13131 1726867204.23573: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867204.23591: _low_level_execute_command(): starting 13131 1726867204.23598: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867204.24095: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867204.24100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867204.24103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867204.24156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867204.24159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867204.24164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867204.24214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867204.25817: stdout chunk (state=3): >>>/root <<< 13131 1726867204.25909: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 13131 1726867204.25937: stderr chunk (state=3): >>><<< 13131 1726867204.25940: stdout chunk (state=3): >>><<< 13131 1726867204.25964: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867204.25976: _low_level_execute_command(): starting 13131 1726867204.25983: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867204.2596362-14186-66233609359804 `" && echo ansible-tmp-1726867204.2596362-14186-66233609359804="` echo /root/.ansible/tmp/ansible-tmp-1726867204.2596362-14186-66233609359804 `" ) && sleep 0' 13131 1726867204.26405: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13131 1726867204.26416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867204.26418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 13131 1726867204.26422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867204.26454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867204.26472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867204.26551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867204.28445: stdout chunk (state=3): >>>ansible-tmp-1726867204.2596362-14186-66233609359804=/root/.ansible/tmp/ansible-tmp-1726867204.2596362-14186-66233609359804 <<< 13131 1726867204.28558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867204.28579: stderr chunk (state=3): >>><<< 13131 1726867204.28582: stdout chunk (state=3): >>><<< 13131 1726867204.28599: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867204.2596362-14186-66233609359804=/root/.ansible/tmp/ansible-tmp-1726867204.2596362-14186-66233609359804 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867204.28623: variable 'ansible_module_compression' from source: unknown 13131 1726867204.28661: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13131 1726867204.28690: variable 'ansible_facts' from source: unknown 13131 1726867204.28739: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867204.2596362-14186-66233609359804/AnsiballZ_command.py 13131 1726867204.28832: Sending initial data 13131 1726867204.28835: Sent initial data (155 bytes) 13131 1726867204.29251: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867204.29254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867204.29256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867204.29258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867204.29262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867204.29311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867204.29318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867204.29360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867204.30872: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13131 1726867204.30880: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" 
debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867204.30913: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13131 1726867204.30958: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp81zxqs91 /root/.ansible/tmp/ansible-tmp-1726867204.2596362-14186-66233609359804/AnsiballZ_command.py <<< 13131 1726867204.30964: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867204.2596362-14186-66233609359804/AnsiballZ_command.py" <<< 13131 1726867204.31003: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp81zxqs91" to remote "/root/.ansible/tmp/ansible-tmp-1726867204.2596362-14186-66233609359804/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867204.2596362-14186-66233609359804/AnsiballZ_command.py" <<< 13131 1726867204.31544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867204.31568: stderr chunk (state=3): >>><<< 13131 1726867204.31572: stdout chunk (state=3): >>><<< 13131 1726867204.31615: done transferring module to remote 13131 1726867204.31622: _low_level_execute_command(): starting 13131 1726867204.31627: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867204.2596362-14186-66233609359804/ /root/.ansible/tmp/ansible-tmp-1726867204.2596362-14186-66233609359804/AnsiballZ_command.py && sleep 0' 13131 1726867204.32184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867204.32192: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867204.32205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867204.32222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867204.32250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867204.33961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867204.33985: stderr chunk (state=3): >>><<< 13131 1726867204.33988: stdout chunk (state=3): >>><<< 13131 1726867204.34001: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867204.34004: _low_level_execute_command(): starting 13131 1726867204.34007: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867204.2596362-14186-66233609359804/AnsiballZ_command.py && sleep 0' 13131 1726867204.34406: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867204.34409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867204.34411: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 13131 1726867204.34413: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867204.34415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867204.34463: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867204.34471: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867204.34517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867204.51680: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 17:20:04.493894", "end": "2024-09-20 17:20:04.515064", "delta": "0:00:00.021170", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13131 1726867204.53254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867204.53281: stderr chunk (state=3): >>><<< 13131 1726867204.53284: stdout chunk (state=3): >>><<< 13131 1726867204.53307: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 17:20:04.493894", "end": "2024-09-20 17:20:04.515064", "delta": "0:00:00.021170", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.12.57 closed. 13131 1726867204.53334: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867204.2596362-14186-66233609359804/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867204.53341: _low_level_execute_command(): starting 13131 1726867204.53347: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867204.2596362-14186-66233609359804/ > /dev/null 2>&1 && sleep 0' 13131 1726867204.53790: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867204.53796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867204.53802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867204.53805: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 13131 1726867204.53807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867204.53860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867204.53863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867204.53864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867204.53903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867204.55685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867204.55709: stderr chunk (state=3): >>><<< 13131 1726867204.55712: stdout chunk (state=3): >>><<< 13131 1726867204.55725: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867204.55732: handler run complete 13131 1726867204.55748: Evaluated conditional (False): False 13131 1726867204.55756: attempt loop complete, returning result 13131 1726867204.55758: _execute() done 13131 1726867204.55761: dumping result to json 13131 1726867204.55766: done dumping result, returning 13131 1726867204.55773: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0affcac9-a3a5-5f24-9b7a-00000000062f] 13131 1726867204.55778: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000062f 13131 1726867204.55872: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000062f 13131 1726867204.55875: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.021170", "end": "2024-09-20 17:20:04.515064", "rc": 0, "start": "2024-09-20 17:20:04.493894" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 13131 1726867204.55945: no more pending results, returning what we have 13131 1726867204.55948: results queue empty 13131 1726867204.55949: checking for any_errors_fatal 13131 1726867204.55955: done checking for any_errors_fatal 13131 1726867204.55956: checking for max_fail_percentage 13131 1726867204.55958: done checking for max_fail_percentage 13131 1726867204.55958: checking to see if all hosts have failed and the running result is not ok 13131 1726867204.55959: done checking to see if all hosts have failed 13131 1726867204.55960: getting the remaining hosts for this loop 13131 1726867204.55961: done getting the remaining hosts for this loop 13131 1726867204.55964: getting the next task for host managed_node1 13131 1726867204.55970: done getting next task for host 
managed_node1 13131 1726867204.55972: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13131 1726867204.55976: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867204.55982: getting variables 13131 1726867204.55983: in VariableManager get_vars() 13131 1726867204.56035: Calling all_inventory to load vars for managed_node1 13131 1726867204.56039: Calling groups_inventory to load vars for managed_node1 13131 1726867204.56041: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867204.56051: Calling all_plugins_play to load vars for managed_node1 13131 1726867204.56053: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867204.56055: Calling groups_plugins_play to load vars for managed_node1 13131 1726867204.56853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867204.57813: done with get_vars() 13131 1726867204.57830: done getting variables 13131 1726867204.57871: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:20:04 -0400 (0:00:00.361) 0:00:19.689 ****** 13131 1726867204.57898: entering _queue_task() for managed_node1/set_fact 13131 1726867204.58139: worker is 1 (out of 1 available) 13131 1726867204.58152: exiting _queue_task() for managed_node1/set_fact 13131 1726867204.58164: done queuing things up, now waiting for results queue to drain 13131 1726867204.58165: waiting for pending results... 
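The `Get NM profile info` task above ran a shell pipeline that filters `nmcli` connection output to decide whether the `bond0.0` profile is persisted under `/etc`. A minimal sketch of that filtering step, run here against an illustrative sample listing rather than a live `nmcli` (which may not be installed on the control node):

```shell
#!/bin/sh
# Sketch of the pipeline from the "Get NM profile info" task:
#   nmcli -f NAME,FILENAME connection show | grep bond0.0 | grep /etc
# The sample listing below is illustrative, not captured from a real host;
# only the /etc-backed bond0.0 line should survive both greps.
sample_output='NAME     FILENAME
bond0.0  /etc/NetworkManager/system-connections/bond0.0.nmconnection
eth0     /run/NetworkManager/system-connections/eth0.nmconnection'

# An exit status of 0 here (a persistent profile was found) is what the
# later "Set NM profile exist flag" task keys on via nm_profile_exists.rc.
printf '%s\n' "$sample_output" | grep bond0.0 | grep /etc
```

Note the double `grep`: the first selects the profile by name, the second restricts the match to profiles whose backing file lives under `/etc` (persistent) rather than `/run` (runtime-only).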
13131 1726867204.58338: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13131 1726867204.58415: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000630 13131 1726867204.58428: variable 'ansible_search_path' from source: unknown 13131 1726867204.58431: variable 'ansible_search_path' from source: unknown 13131 1726867204.58461: calling self._execute() 13131 1726867204.58537: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.58541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.58550: variable 'omit' from source: magic vars 13131 1726867204.58833: variable 'ansible_distribution_major_version' from source: facts 13131 1726867204.58840: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867204.58932: variable 'nm_profile_exists' from source: set_fact 13131 1726867204.58944: Evaluated conditional (nm_profile_exists.rc == 0): True 13131 1726867204.58951: variable 'omit' from source: magic vars 13131 1726867204.58981: variable 'omit' from source: magic vars 13131 1726867204.59005: variable 'omit' from source: magic vars 13131 1726867204.59035: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867204.59065: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867204.59082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867204.59098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867204.59107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867204.59131: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
13131 1726867204.59134: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.59137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.59205: Set connection var ansible_connection to ssh 13131 1726867204.59212: Set connection var ansible_timeout to 10 13131 1726867204.59215: Set connection var ansible_shell_type to sh 13131 1726867204.59223: Set connection var ansible_shell_executable to /bin/sh 13131 1726867204.59230: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867204.59235: Set connection var ansible_pipelining to False 13131 1726867204.59252: variable 'ansible_shell_executable' from source: unknown 13131 1726867204.59255: variable 'ansible_connection' from source: unknown 13131 1726867204.59258: variable 'ansible_module_compression' from source: unknown 13131 1726867204.59262: variable 'ansible_shell_type' from source: unknown 13131 1726867204.59265: variable 'ansible_shell_executable' from source: unknown 13131 1726867204.59267: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.59269: variable 'ansible_pipelining' from source: unknown 13131 1726867204.59272: variable 'ansible_timeout' from source: unknown 13131 1726867204.59274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.59372: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867204.59381: variable 'omit' from source: magic vars 13131 1726867204.59388: starting attempt loop 13131 1726867204.59391: running the handler 13131 1726867204.59404: handler run complete 13131 1726867204.59411: attempt loop complete, returning result 13131 1726867204.59413: _execute() done 
13131 1726867204.59416: dumping result to json 13131 1726867204.59418: done dumping result, returning 13131 1726867204.59426: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-5f24-9b7a-000000000630] 13131 1726867204.59430: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000630 13131 1726867204.59512: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000630 13131 1726867204.59514: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 13131 1726867204.59566: no more pending results, returning what we have 13131 1726867204.59569: results queue empty 13131 1726867204.59570: checking for any_errors_fatal 13131 1726867204.59582: done checking for any_errors_fatal 13131 1726867204.59582: checking for max_fail_percentage 13131 1726867204.59585: done checking for max_fail_percentage 13131 1726867204.59585: checking to see if all hosts have failed and the running result is not ok 13131 1726867204.59586: done checking to see if all hosts have failed 13131 1726867204.59586: getting the remaining hosts for this loop 13131 1726867204.59588: done getting the remaining hosts for this loop 13131 1726867204.59591: getting the next task for host managed_node1 13131 1726867204.59602: done getting next task for host managed_node1 13131 1726867204.59604: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 13131 1726867204.59609: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867204.59612: getting variables 13131 1726867204.59613: in VariableManager get_vars() 13131 1726867204.59667: Calling all_inventory to load vars for managed_node1 13131 1726867204.59670: Calling groups_inventory to load vars for managed_node1 13131 1726867204.59672: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867204.59683: Calling all_plugins_play to load vars for managed_node1 13131 1726867204.59685: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867204.59688: Calling groups_plugins_play to load vars for managed_node1 13131 1726867204.60485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867204.61366: done with get_vars() 13131 1726867204.61384: done getting variables 13131 1726867204.61426: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867204.61513: variable 'profile' from source: include params 13131 1726867204.61516: variable 'item' from source: include params 13131 1726867204.61555: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:20:04 -0400 (0:00:00.036) 0:00:19.726 ****** 13131 1726867204.61585: entering _queue_task() for managed_node1/command 13131 1726867204.61827: worker is 1 (out of 1 available) 13131 1726867204.61839: exiting _queue_task() for managed_node1/command 13131 1726867204.61849: done queuing things up, now waiting for results queue to drain 13131 1726867204.61851: waiting for pending results... 13131 1726867204.62032: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 13131 1726867204.62116: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000632 13131 1726867204.62126: variable 'ansible_search_path' from source: unknown 13131 1726867204.62129: variable 'ansible_search_path' from source: unknown 13131 1726867204.62159: calling self._execute() 13131 1726867204.62239: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.62243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.62252: variable 'omit' from source: magic vars 13131 1726867204.62537: variable 'ansible_distribution_major_version' from source: facts 13131 1726867204.62547: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867204.62632: variable 'profile_stat' from source: set_fact 13131 1726867204.62643: Evaluated conditional (profile_stat.stat.exists): False 13131 1726867204.62647: when evaluation is False, skipping this task 13131 1726867204.62649: _execute() done 13131 1726867204.62652: dumping result to json 13131 1726867204.62654: done dumping result, returning 13131 1726867204.62661: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [0affcac9-a3a5-5f24-9b7a-000000000632] 13131 1726867204.62665: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000632 13131 
1726867204.62744: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000632 13131 1726867204.62746: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13131 1726867204.62798: no more pending results, returning what we have 13131 1726867204.62802: results queue empty 13131 1726867204.62803: checking for any_errors_fatal 13131 1726867204.62809: done checking for any_errors_fatal 13131 1726867204.62810: checking for max_fail_percentage 13131 1726867204.62811: done checking for max_fail_percentage 13131 1726867204.62812: checking to see if all hosts have failed and the running result is not ok 13131 1726867204.62813: done checking to see if all hosts have failed 13131 1726867204.62814: getting the remaining hosts for this loop 13131 1726867204.62815: done getting the remaining hosts for this loop 13131 1726867204.62818: getting the next task for host managed_node1 13131 1726867204.62825: done getting next task for host managed_node1 13131 1726867204.62827: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13131 1726867204.62831: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13131 1726867204.62835: getting variables 13131 1726867204.62837: in VariableManager get_vars() 13131 1726867204.62887: Calling all_inventory to load vars for managed_node1 13131 1726867204.62889: Calling groups_inventory to load vars for managed_node1 13131 1726867204.62891: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867204.62902: Calling all_plugins_play to load vars for managed_node1 13131 1726867204.62904: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867204.62907: Calling groups_plugins_play to load vars for managed_node1 13131 1726867204.63811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867204.64654: done with get_vars() 13131 1726867204.64668: done getting variables 13131 1726867204.64709: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867204.64785: variable 'profile' from source: include params 13131 1726867204.64788: variable 'item' from source: include params 13131 1726867204.64825: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:20:04 -0400 (0:00:00.032) 0:00:19.759 ****** 13131 1726867204.64850: entering _queue_task() for managed_node1/set_fact 13131 1726867204.65073: worker is 1 (out of 1 available) 13131 1726867204.65087: exiting _queue_task() for managed_node1/set_fact 13131 1726867204.65098: done queuing things up, now waiting for results queue 
to drain 13131 1726867204.65099: waiting for pending results... 13131 1726867204.65261: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 13131 1726867204.65338: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000633 13131 1726867204.65349: variable 'ansible_search_path' from source: unknown 13131 1726867204.65353: variable 'ansible_search_path' from source: unknown 13131 1726867204.65382: calling self._execute() 13131 1726867204.65450: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.65456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.65464: variable 'omit' from source: magic vars 13131 1726867204.65730: variable 'ansible_distribution_major_version' from source: facts 13131 1726867204.65739: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867204.65825: variable 'profile_stat' from source: set_fact 13131 1726867204.65835: Evaluated conditional (profile_stat.stat.exists): False 13131 1726867204.65838: when evaluation is False, skipping this task 13131 1726867204.65841: _execute() done 13131 1726867204.65844: dumping result to json 13131 1726867204.65846: done dumping result, returning 13131 1726867204.65853: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [0affcac9-a3a5-5f24-9b7a-000000000633] 13131 1726867204.65856: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000633 13131 1726867204.65936: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000633 13131 1726867204.65939: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13131 1726867204.66016: no more pending results, returning what we have 13131 1726867204.66019: results queue empty 13131 1726867204.66020: checking for any_errors_fatal 13131 
1726867204.66024: done checking for any_errors_fatal 13131 1726867204.66025: checking for max_fail_percentage 13131 1726867204.66027: done checking for max_fail_percentage 13131 1726867204.66028: checking to see if all hosts have failed and the running result is not ok 13131 1726867204.66028: done checking to see if all hosts have failed 13131 1726867204.66029: getting the remaining hosts for this loop 13131 1726867204.66030: done getting the remaining hosts for this loop 13131 1726867204.66032: getting the next task for host managed_node1 13131 1726867204.66039: done getting next task for host managed_node1 13131 1726867204.66041: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13131 1726867204.66045: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867204.66048: getting variables 13131 1726867204.66049: in VariableManager get_vars() 13131 1726867204.66091: Calling all_inventory to load vars for managed_node1 13131 1726867204.66094: Calling groups_inventory to load vars for managed_node1 13131 1726867204.66096: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867204.66105: Calling all_plugins_play to load vars for managed_node1 13131 1726867204.66107: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867204.66109: Calling groups_plugins_play to load vars for managed_node1 13131 1726867204.66842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867204.67703: done with get_vars() 13131 1726867204.67717: done getting variables 13131 1726867204.67759: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867204.67834: variable 'profile' from source: include params 13131 1726867204.67837: variable 'item' from source: include params 13131 1726867204.67875: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:20:04 -0400 (0:00:00.030) 0:00:19.789 ****** 13131 1726867204.67899: entering _queue_task() for managed_node1/command 13131 1726867204.68122: worker is 1 (out of 1 available) 13131 1726867204.68137: exiting _queue_task() for managed_node1/command 13131 1726867204.68148: done queuing things up, now waiting for results queue to drain 13131 1726867204.68149: waiting for pending results... 
13131 1726867204.68318: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 13131 1726867204.68389: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000634 13131 1726867204.68401: variable 'ansible_search_path' from source: unknown 13131 1726867204.68405: variable 'ansible_search_path' from source: unknown 13131 1726867204.68433: calling self._execute() 13131 1726867204.68511: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.68515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.68523: variable 'omit' from source: magic vars 13131 1726867204.68785: variable 'ansible_distribution_major_version' from source: facts 13131 1726867204.68795: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867204.68876: variable 'profile_stat' from source: set_fact 13131 1726867204.68889: Evaluated conditional (profile_stat.stat.exists): False 13131 1726867204.68892: when evaluation is False, skipping this task 13131 1726867204.68895: _execute() done 13131 1726867204.68900: dumping result to json 13131 1726867204.68903: done dumping result, returning 13131 1726867204.68909: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 [0affcac9-a3a5-5f24-9b7a-000000000634] 13131 1726867204.68914: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000634 13131 1726867204.68997: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000634 13131 1726867204.69000: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13131 1726867204.69071: no more pending results, returning what we have 13131 1726867204.69075: results queue empty 13131 1726867204.69076: checking for any_errors_fatal 13131 1726867204.69082: done checking for any_errors_fatal 13131 1726867204.69083: checking for 
max_fail_percentage 13131 1726867204.69084: done checking for max_fail_percentage 13131 1726867204.69085: checking to see if all hosts have failed and the running result is not ok 13131 1726867204.69086: done checking to see if all hosts have failed 13131 1726867204.69086: getting the remaining hosts for this loop 13131 1726867204.69088: done getting the remaining hosts for this loop 13131 1726867204.69090: getting the next task for host managed_node1 13131 1726867204.69096: done getting next task for host managed_node1 13131 1726867204.69098: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13131 1726867204.69101: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867204.69104: getting variables 13131 1726867204.69106: in VariableManager get_vars() 13131 1726867204.69146: Calling all_inventory to load vars for managed_node1 13131 1726867204.69148: Calling groups_inventory to load vars for managed_node1 13131 1726867204.69150: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867204.69159: Calling all_plugins_play to load vars for managed_node1 13131 1726867204.69161: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867204.69164: Calling groups_plugins_play to load vars for managed_node1 13131 1726867204.70021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867204.70874: done with get_vars() 13131 1726867204.70890: done getting variables 13131 1726867204.70935: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867204.71012: variable 'profile' from source: include params 13131 1726867204.71015: variable 'item' from source: include params 13131 1726867204.71054: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:20:04 -0400 (0:00:00.031) 0:00:19.821 ****** 13131 1726867204.71075: entering _queue_task() for managed_node1/set_fact 13131 1726867204.71299: worker is 1 (out of 1 available) 13131 1726867204.71312: exiting _queue_task() for managed_node1/set_fact 13131 1726867204.71324: done queuing things up, now waiting for results queue to drain 13131 1726867204.71325: waiting for pending results... 
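Each of the ifcfg-related tasks above is skipped with `Conditional result was False` because their `when:` clause checks `profile_stat.stat.exists`, and no legacy ifcfg file exists on this NetworkManager-keyfile host. A hedged sketch of that gate (the ifcfg path is an assumption for illustration; the real condition evaluates a previously gathered `stat` result):

```shell
#!/bin/sh
# Mirrors the "when: profile_stat.stat.exists" guard from
# get_profile_stat.yml: the comment-checking tasks only run when a
# legacy ifcfg file is present; otherwise they report a skip.
should_check_ifcfg() {
    [ -e "$1" ]
}

# Assumed legacy path, shown only to illustrate the branch taken.
ifcfg=/etc/sysconfig/network-scripts/ifcfg-bond0.0

if should_check_ifcfg "$ifcfg"; then
    echo "checking ansible_managed comment in $ifcfg"
else
    echo "skipping: Conditional result was False"
fi
```

On this run the keyfile backend stores the profile as `bond0.0.nmconnection`, so the ifcfg branch never fires and all four comment checks skip cleanly.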
13131 1726867204.71492: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 13131 1726867204.71563: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000635 13131 1726867204.71575: variable 'ansible_search_path' from source: unknown 13131 1726867204.71579: variable 'ansible_search_path' from source: unknown 13131 1726867204.71610: calling self._execute() 13131 1726867204.71680: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.71690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.71693: variable 'omit' from source: magic vars 13131 1726867204.71954: variable 'ansible_distribution_major_version' from source: facts 13131 1726867204.71964: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867204.72047: variable 'profile_stat' from source: set_fact 13131 1726867204.72059: Evaluated conditional (profile_stat.stat.exists): False 13131 1726867204.72062: when evaluation is False, skipping this task 13131 1726867204.72065: _execute() done 13131 1726867204.72067: dumping result to json 13131 1726867204.72070: done dumping result, returning 13131 1726867204.72075: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [0affcac9-a3a5-5f24-9b7a-000000000635] 13131 1726867204.72082: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000635 13131 1726867204.72165: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000635 13131 1726867204.72168: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13131 1726867204.72235: no more pending results, returning what we have 13131 1726867204.72239: results queue empty 13131 1726867204.72240: checking for any_errors_fatal 13131 1726867204.72245: done checking for any_errors_fatal 13131 1726867204.72246: checking 
for max_fail_percentage 13131 1726867204.72248: done checking for max_fail_percentage 13131 1726867204.72248: checking to see if all hosts have failed and the running result is not ok 13131 1726867204.72249: done checking to see if all hosts have failed 13131 1726867204.72250: getting the remaining hosts for this loop 13131 1726867204.72251: done getting the remaining hosts for this loop 13131 1726867204.72254: getting the next task for host managed_node1 13131 1726867204.72260: done getting next task for host managed_node1 13131 1726867204.72262: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13131 1726867204.72265: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867204.72269: getting variables 13131 1726867204.72270: in VariableManager get_vars() 13131 1726867204.72311: Calling all_inventory to load vars for managed_node1 13131 1726867204.72314: Calling groups_inventory to load vars for managed_node1 13131 1726867204.72330: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867204.72342: Calling all_plugins_play to load vars for managed_node1 13131 1726867204.72345: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867204.72348: Calling groups_plugins_play to load vars for managed_node1 13131 1726867204.73413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867204.75624: done with get_vars() 13131 1726867204.75648: done getting variables 13131 1726867204.75824: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867204.75939: variable 'profile' from source: include params 13131 1726867204.75943: variable 'item' from source: include params 13131 1726867204.76041: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 17:20:04 -0400 (0:00:00.049) 0:00:19.871 ****** 13131 1726867204.76073: entering _queue_task() for managed_node1/assert 13131 1726867204.76421: worker is 1 (out of 1 available) 13131 1726867204.76434: exiting _queue_task() for managed_node1/assert 13131 1726867204.76448: done queuing things up, now waiting for results queue to drain 13131 1726867204.76449: waiting for pending results... 
13131 1726867204.76997: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.0' 13131 1726867204.77026: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000035d 13131 1726867204.77088: variable 'ansible_search_path' from source: unknown 13131 1726867204.77096: variable 'ansible_search_path' from source: unknown 13131 1726867204.77110: calling self._execute() 13131 1726867204.77223: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.77238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.77306: variable 'omit' from source: magic vars 13131 1726867204.77670: variable 'ansible_distribution_major_version' from source: facts 13131 1726867204.77689: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867204.77743: variable 'omit' from source: magic vars 13131 1726867204.77761: variable 'omit' from source: magic vars 13131 1726867204.77867: variable 'profile' from source: include params 13131 1726867204.77870: variable 'item' from source: include params 13131 1726867204.77960: variable 'item' from source: include params 13131 1726867204.78068: variable 'omit' from source: magic vars 13131 1726867204.78071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867204.78074: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867204.78105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867204.78128: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867204.78178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867204.78189: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 13131 1726867204.78203: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.78211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.78476: Set connection var ansible_connection to ssh 13131 1726867204.78492: Set connection var ansible_timeout to 10 13131 1726867204.78529: Set connection var ansible_shell_type to sh 13131 1726867204.78541: Set connection var ansible_shell_executable to /bin/sh 13131 1726867204.78552: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867204.78633: Set connection var ansible_pipelining to False 13131 1726867204.78636: variable 'ansible_shell_executable' from source: unknown 13131 1726867204.78638: variable 'ansible_connection' from source: unknown 13131 1726867204.78641: variable 'ansible_module_compression' from source: unknown 13131 1726867204.78643: variable 'ansible_shell_type' from source: unknown 13131 1726867204.78645: variable 'ansible_shell_executable' from source: unknown 13131 1726867204.78647: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.78652: variable 'ansible_pipelining' from source: unknown 13131 1726867204.78659: variable 'ansible_timeout' from source: unknown 13131 1726867204.78666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.78818: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867204.78834: variable 'omit' from source: magic vars 13131 1726867204.78849: starting attempt loop 13131 1726867204.78856: running the handler 13131 1726867204.78982: variable 'lsr_net_profile_exists' from source: set_fact 13131 1726867204.78985: Evaluated conditional 
(lsr_net_profile_exists): True 13131 1726867204.79016: handler run complete 13131 1726867204.79019: attempt loop complete, returning result 13131 1726867204.79021: _execute() done 13131 1726867204.79026: dumping result to json 13131 1726867204.79033: done dumping result, returning 13131 1726867204.79043: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.0' [0affcac9-a3a5-5f24-9b7a-00000000035d] 13131 1726867204.79050: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000035d ok: [managed_node1] => { "changed": false } MSG: All assertions passed 13131 1726867204.79298: no more pending results, returning what we have 13131 1726867204.79302: results queue empty 13131 1726867204.79303: checking for any_errors_fatal 13131 1726867204.79309: done checking for any_errors_fatal 13131 1726867204.79309: checking for max_fail_percentage 13131 1726867204.79311: done checking for max_fail_percentage 13131 1726867204.79312: checking to see if all hosts have failed and the running result is not ok 13131 1726867204.79313: done checking to see if all hosts have failed 13131 1726867204.79314: getting the remaining hosts for this loop 13131 1726867204.79315: done getting the remaining hosts for this loop 13131 1726867204.79318: getting the next task for host managed_node1 13131 1726867204.79324: done getting next task for host managed_node1 13131 1726867204.79327: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13131 1726867204.79330: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867204.79334: getting variables 13131 1726867204.79336: in VariableManager get_vars() 13131 1726867204.79396: Calling all_inventory to load vars for managed_node1 13131 1726867204.79399: Calling groups_inventory to load vars for managed_node1 13131 1726867204.79402: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867204.79413: Calling all_plugins_play to load vars for managed_node1 13131 1726867204.79416: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867204.79418: Calling groups_plugins_play to load vars for managed_node1 13131 1726867204.79991: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000035d 13131 1726867204.79997: WORKER PROCESS EXITING 13131 1726867204.80999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867204.82701: done with get_vars() 13131 1726867204.82721: done getting variables 13131 1726867204.82784: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867204.82910: variable 'profile' from source: include params 13131 1726867204.82915: variable 'item' from source: include params 13131 1726867204.82980: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 17:20:04 -0400 (0:00:00.069) 0:00:19.940 ****** 13131 1726867204.83021: entering _queue_task() for managed_node1/assert 13131 
1726867204.83346: worker is 1 (out of 1 available) 13131 1726867204.83358: exiting _queue_task() for managed_node1/assert 13131 1726867204.83368: done queuing things up, now waiting for results queue to drain 13131 1726867204.83369: waiting for pending results... 13131 1726867204.83655: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' 13131 1726867204.83766: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000035e 13131 1726867204.83791: variable 'ansible_search_path' from source: unknown 13131 1726867204.83802: variable 'ansible_search_path' from source: unknown 13131 1726867204.83845: calling self._execute() 13131 1726867204.83950: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.83963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.83980: variable 'omit' from source: magic vars 13131 1726867204.84374: variable 'ansible_distribution_major_version' from source: facts 13131 1726867204.84380: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867204.84386: variable 'omit' from source: magic vars 13131 1726867204.84433: variable 'omit' from source: magic vars 13131 1726867204.84550: variable 'profile' from source: include params 13131 1726867204.84583: variable 'item' from source: include params 13131 1726867204.84639: variable 'item' from source: include params 13131 1726867204.84664: variable 'omit' from source: magic vars 13131 1726867204.84783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867204.84787: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867204.84789: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867204.84819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 13131 1726867204.84839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867204.84879: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867204.84890: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.84904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.85009: Set connection var ansible_connection to ssh 13131 1726867204.85030: Set connection var ansible_timeout to 10 13131 1726867204.85039: Set connection var ansible_shell_type to sh 13131 1726867204.85053: Set connection var ansible_shell_executable to /bin/sh 13131 1726867204.85068: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867204.85140: Set connection var ansible_pipelining to False 13131 1726867204.85144: variable 'ansible_shell_executable' from source: unknown 13131 1726867204.85146: variable 'ansible_connection' from source: unknown 13131 1726867204.85149: variable 'ansible_module_compression' from source: unknown 13131 1726867204.85151: variable 'ansible_shell_type' from source: unknown 13131 1726867204.85153: variable 'ansible_shell_executable' from source: unknown 13131 1726867204.85155: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.85157: variable 'ansible_pipelining' from source: unknown 13131 1726867204.85159: variable 'ansible_timeout' from source: unknown 13131 1726867204.85161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.85328: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 
1726867204.85347: variable 'omit' from source: magic vars 13131 1726867204.85367: starting attempt loop 13131 1726867204.85375: running the handler 13131 1726867204.85505: variable 'lsr_net_profile_ansible_managed' from source: set_fact 13131 1726867204.85578: Evaluated conditional (lsr_net_profile_ansible_managed): True 13131 1726867204.85582: handler run complete 13131 1726867204.85584: attempt loop complete, returning result 13131 1726867204.85586: _execute() done 13131 1726867204.85588: dumping result to json 13131 1726867204.85590: done dumping result, returning 13131 1726867204.85595: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' [0affcac9-a3a5-5f24-9b7a-00000000035e] 13131 1726867204.85597: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000035e ok: [managed_node1] => { "changed": false } MSG: All assertions passed 13131 1726867204.85735: no more pending results, returning what we have 13131 1726867204.85739: results queue empty 13131 1726867204.85740: checking for any_errors_fatal 13131 1726867204.85748: done checking for any_errors_fatal 13131 1726867204.85749: checking for max_fail_percentage 13131 1726867204.85751: done checking for max_fail_percentage 13131 1726867204.85752: checking to see if all hosts have failed and the running result is not ok 13131 1726867204.85752: done checking to see if all hosts have failed 13131 1726867204.85753: getting the remaining hosts for this loop 13131 1726867204.85755: done getting the remaining hosts for this loop 13131 1726867204.85758: getting the next task for host managed_node1 13131 1726867204.85765: done getting next task for host managed_node1 13131 1726867204.85768: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 13131 1726867204.85771: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867204.85775: getting variables 13131 1726867204.85883: in VariableManager get_vars() 13131 1726867204.85938: Calling all_inventory to load vars for managed_node1 13131 1726867204.85941: Calling groups_inventory to load vars for managed_node1 13131 1726867204.85943: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867204.85955: Calling all_plugins_play to load vars for managed_node1 13131 1726867204.85958: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867204.85961: Calling groups_plugins_play to load vars for managed_node1 13131 1726867204.86480: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000035e 13131 1726867204.86484: WORKER PROCESS EXITING 13131 1726867204.87945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867204.89599: done with get_vars() 13131 1726867204.89618: done getting variables 13131 1726867204.89679: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867204.89789: variable 'profile' from source: include params 13131 1726867204.89795: variable 'item' from source: include params 13131 1726867204.89850: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** 
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 17:20:04 -0400 (0:00:00.068) 0:00:20.009 ****** 13131 1726867204.89891: entering _queue_task() for managed_node1/assert 13131 1726867204.90163: worker is 1 (out of 1 available) 13131 1726867204.90175: exiting _queue_task() for managed_node1/assert 13131 1726867204.90189: done queuing things up, now waiting for results queue to drain 13131 1726867204.90191: waiting for pending results... 13131 1726867204.90595: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.0 13131 1726867204.90599: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000035f 13131 1726867204.90603: variable 'ansible_search_path' from source: unknown 13131 1726867204.90607: variable 'ansible_search_path' from source: unknown 13131 1726867204.90614: calling self._execute() 13131 1726867204.90703: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.90709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.90726: variable 'omit' from source: magic vars 13131 1726867204.91088: variable 'ansible_distribution_major_version' from source: facts 13131 1726867204.91103: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867204.91110: variable 'omit' from source: magic vars 13131 1726867204.91145: variable 'omit' from source: magic vars 13131 1726867204.91246: variable 'profile' from source: include params 13131 1726867204.91250: variable 'item' from source: include params 13131 1726867204.91324: variable 'item' from source: include params 13131 1726867204.91342: variable 'omit' from source: magic vars 13131 1726867204.91385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867204.91422: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867204.91439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867204.91456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867204.91466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867204.91506: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867204.91510: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.91513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.91682: Set connection var ansible_connection to ssh 13131 1726867204.91686: Set connection var ansible_timeout to 10 13131 1726867204.91688: Set connection var ansible_shell_type to sh 13131 1726867204.91690: Set connection var ansible_shell_executable to /bin/sh 13131 1726867204.91692: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867204.91694: Set connection var ansible_pipelining to False 13131 1726867204.91696: variable 'ansible_shell_executable' from source: unknown 13131 1726867204.91698: variable 'ansible_connection' from source: unknown 13131 1726867204.91701: variable 'ansible_module_compression' from source: unknown 13131 1726867204.91704: variable 'ansible_shell_type' from source: unknown 13131 1726867204.91706: variable 'ansible_shell_executable' from source: unknown 13131 1726867204.91709: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.91711: variable 'ansible_pipelining' from source: unknown 13131 1726867204.91714: variable 'ansible_timeout' from source: unknown 13131 1726867204.91717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 
1726867204.91845: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867204.91856: variable 'omit' from source: magic vars 13131 1726867204.91861: starting attempt loop 13131 1726867204.91864: running the handler 13131 1726867204.92182: variable 'lsr_net_profile_fingerprint' from source: set_fact 13131 1726867204.92185: Evaluated conditional (lsr_net_profile_fingerprint): True 13131 1726867204.92187: handler run complete 13131 1726867204.92189: attempt loop complete, returning result 13131 1726867204.92191: _execute() done 13131 1726867204.92193: dumping result to json 13131 1726867204.92195: done dumping result, returning 13131 1726867204.92197: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.0 [0affcac9-a3a5-5f24-9b7a-00000000035f] 13131 1726867204.92199: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000035f 13131 1726867204.92251: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000035f 13131 1726867204.92254: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 13131 1726867204.92300: no more pending results, returning what we have 13131 1726867204.92303: results queue empty 13131 1726867204.92304: checking for any_errors_fatal 13131 1726867204.92308: done checking for any_errors_fatal 13131 1726867204.92309: checking for max_fail_percentage 13131 1726867204.92311: done checking for max_fail_percentage 13131 1726867204.92311: checking to see if all hosts have failed and the running result is not ok 13131 1726867204.92312: done checking to see if all hosts have failed 13131 1726867204.92313: getting the remaining hosts for this loop 13131 1726867204.92314: done getting the 
remaining hosts for this loop 13131 1726867204.92317: getting the next task for host managed_node1 13131 1726867204.92325: done getting next task for host managed_node1 13131 1726867204.92327: ^ task is: TASK: Include the task 'get_profile_stat.yml' 13131 1726867204.92330: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867204.92333: getting variables 13131 1726867204.92334: in VariableManager get_vars() 13131 1726867204.92379: Calling all_inventory to load vars for managed_node1 13131 1726867204.92381: Calling groups_inventory to load vars for managed_node1 13131 1726867204.92384: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867204.92394: Calling all_plugins_play to load vars for managed_node1 13131 1726867204.92397: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867204.92400: Calling groups_plugins_play to load vars for managed_node1 13131 1726867204.93780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867204.95486: done with get_vars() 13131 1726867204.95509: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 17:20:04 -0400 (0:00:00.057) 0:00:20.066 ****** 13131 1726867204.95629: entering 
_queue_task() for managed_node1/include_tasks 13131 1726867204.95995: worker is 1 (out of 1 available) 13131 1726867204.96007: exiting _queue_task() for managed_node1/include_tasks 13131 1726867204.96024: done queuing things up, now waiting for results queue to drain 13131 1726867204.96026: waiting for pending results... 13131 1726867204.96296: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 13131 1726867204.96415: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000363 13131 1726867204.96428: variable 'ansible_search_path' from source: unknown 13131 1726867204.96431: variable 'ansible_search_path' from source: unknown 13131 1726867204.96473: calling self._execute() 13131 1726867204.96571: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867204.96578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867204.96590: variable 'omit' from source: magic vars 13131 1726867204.96951: variable 'ansible_distribution_major_version' from source: facts 13131 1726867204.96961: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867204.96967: _execute() done 13131 1726867204.96970: dumping result to json 13131 1726867204.96974: done dumping result, returning 13131 1726867204.96999: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-5f24-9b7a-000000000363] 13131 1726867204.97002: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000363 13131 1726867204.97083: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000363 13131 1726867204.97085: WORKER PROCESS EXITING 13131 1726867204.97127: no more pending results, returning what we have 13131 1726867204.97131: in VariableManager get_vars() 13131 1726867204.97186: Calling all_inventory to load vars for managed_node1 13131 1726867204.97189: Calling groups_inventory to load vars for managed_node1 13131 1726867204.97191: Calling 
all_plugins_inventory to load vars for managed_node1 13131 1726867204.97201: Calling all_plugins_play to load vars for managed_node1 13131 1726867204.97203: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867204.97206: Calling groups_plugins_play to load vars for managed_node1 13131 1726867204.98203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867204.99046: done with get_vars() 13131 1726867204.99061: variable 'ansible_search_path' from source: unknown 13131 1726867204.99062: variable 'ansible_search_path' from source: unknown 13131 1726867204.99087: we have included files to process 13131 1726867204.99088: generating all_blocks data 13131 1726867204.99089: done generating all_blocks data 13131 1726867204.99091: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13131 1726867204.99094: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13131 1726867204.99095: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13131 1726867204.99943: done processing included file 13131 1726867204.99945: iterating over new_blocks loaded from include file 13131 1726867204.99946: in VariableManager get_vars() 13131 1726867204.99971: done with get_vars() 13131 1726867204.99973: filtering new block on tags 13131 1726867205.00002: done filtering new block on tags 13131 1726867205.00005: in VariableManager get_vars() 13131 1726867205.00030: done with get_vars() 13131 1726867205.00032: filtering new block on tags 13131 1726867205.00055: done filtering new block on tags 13131 1726867205.00057: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 13131 1726867205.00061: extending task lists for all hosts with included blocks 13131 1726867205.00233: done extending task lists 13131 1726867205.00234: done processing included files 13131 1726867205.00235: results queue empty 13131 1726867205.00236: checking for any_errors_fatal 13131 1726867205.00239: done checking for any_errors_fatal 13131 1726867205.00239: checking for max_fail_percentage 13131 1726867205.00240: done checking for max_fail_percentage 13131 1726867205.00241: checking to see if all hosts have failed and the running result is not ok 13131 1726867205.00242: done checking to see if all hosts have failed 13131 1726867205.00243: getting the remaining hosts for this loop 13131 1726867205.00244: done getting the remaining hosts for this loop 13131 1726867205.00250: getting the next task for host managed_node1 13131 1726867205.00254: done getting next task for host managed_node1 13131 1726867205.00256: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 13131 1726867205.00259: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 13131 1726867205.00262: getting variables 13131 1726867205.00263: in VariableManager get_vars() 13131 1726867205.00275: Calling all_inventory to load vars for managed_node1 13131 1726867205.00278: Calling groups_inventory to load vars for managed_node1 13131 1726867205.00280: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867205.00284: Calling all_plugins_play to load vars for managed_node1 13131 1726867205.00285: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867205.00287: Calling groups_plugins_play to load vars for managed_node1 13131 1726867205.00925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867205.01803: done with get_vars() 13131 1726867205.01817: done getting variables 13131 1726867205.01841: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:20:05 -0400 (0:00:00.062) 0:00:20.129 ****** 13131 1726867205.01861: entering _queue_task() for managed_node1/set_fact 13131 1726867205.02059: worker is 1 (out of 1 available) 13131 1726867205.02071: exiting _queue_task() for managed_node1/set_fact 13131 1726867205.02084: done queuing things up, now waiting for results queue to drain 13131 1726867205.02085: waiting for pending results... 
13131 1726867205.02258: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 13131 1726867205.02354: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000674 13131 1726867205.02359: variable 'ansible_search_path' from source: unknown 13131 1726867205.02362: variable 'ansible_search_path' from source: unknown 13131 1726867205.02597: calling self._execute() 13131 1726867205.02601: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867205.02604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867205.02607: variable 'omit' from source: magic vars 13131 1726867205.02874: variable 'ansible_distribution_major_version' from source: facts 13131 1726867205.02896: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867205.02909: variable 'omit' from source: magic vars 13131 1726867205.02966: variable 'omit' from source: magic vars 13131 1726867205.03010: variable 'omit' from source: magic vars 13131 1726867205.03063: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867205.03103: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867205.03127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867205.03150: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867205.03175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867205.03211: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867205.03273: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867205.03278: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 13131 1726867205.03332: Set connection var ansible_connection to ssh 13131 1726867205.03344: Set connection var ansible_timeout to 10 13131 1726867205.03352: Set connection var ansible_shell_type to sh 13131 1726867205.03365: Set connection var ansible_shell_executable to /bin/sh 13131 1726867205.03395: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867205.03407: Set connection var ansible_pipelining to False 13131 1726867205.03430: variable 'ansible_shell_executable' from source: unknown 13131 1726867205.03438: variable 'ansible_connection' from source: unknown 13131 1726867205.03445: variable 'ansible_module_compression' from source: unknown 13131 1726867205.03484: variable 'ansible_shell_type' from source: unknown 13131 1726867205.03495: variable 'ansible_shell_executable' from source: unknown 13131 1726867205.03498: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867205.03501: variable 'ansible_pipelining' from source: unknown 13131 1726867205.03503: variable 'ansible_timeout' from source: unknown 13131 1726867205.03505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867205.03617: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867205.03627: variable 'omit' from source: magic vars 13131 1726867205.03632: starting attempt loop 13131 1726867205.03635: running the handler 13131 1726867205.03646: handler run complete 13131 1726867205.03653: attempt loop complete, returning result 13131 1726867205.03655: _execute() done 13131 1726867205.03658: dumping result to json 13131 1726867205.03660: done dumping result, returning 13131 1726867205.03667: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-5f24-9b7a-000000000674] 13131 1726867205.03671: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000674 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 13131 1726867205.03826: no more pending results, returning what we have 13131 1726867205.03829: results queue empty 13131 1726867205.03830: checking for any_errors_fatal 13131 1726867205.03831: done checking for any_errors_fatal 13131 1726867205.03832: checking for max_fail_percentage 13131 1726867205.03833: done checking for max_fail_percentage 13131 1726867205.03834: checking to see if all hosts have failed and the running result is not ok 13131 1726867205.03834: done checking to see if all hosts have failed 13131 1726867205.03835: getting the remaining hosts for this loop 13131 1726867205.03836: done getting the remaining hosts for this loop 13131 1726867205.03839: getting the next task for host managed_node1 13131 1726867205.03845: done getting next task for host managed_node1 13131 1726867205.03846: ^ task is: TASK: Stat profile file 13131 1726867205.03850: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867205.03853: getting variables 13131 1726867205.03854: in VariableManager get_vars() 13131 1726867205.03895: Calling all_inventory to load vars for managed_node1 13131 1726867205.03897: Calling groups_inventory to load vars for managed_node1 13131 1726867205.03899: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867205.03908: Calling all_plugins_play to load vars for managed_node1 13131 1726867205.03911: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867205.03922: Calling groups_plugins_play to load vars for managed_node1 13131 1726867205.04442: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000674 13131 1726867205.04446: WORKER PROCESS EXITING 13131 1726867205.04653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867205.05781: done with get_vars() 13131 1726867205.05805: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:20:05 -0400 (0:00:00.040) 0:00:20.169 ****** 13131 1726867205.05885: entering _queue_task() for managed_node1/stat 13131 1726867205.06122: worker is 1 (out of 1 available) 13131 1726867205.06134: exiting _queue_task() for managed_node1/stat 13131 1726867205.06146: done queuing things up, now waiting for results queue to drain 13131 1726867205.06148: waiting for pending results... 
13131 1726867205.06494: running TaskExecutor() for managed_node1/TASK: Stat profile file 13131 1726867205.06521: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000675 13131 1726867205.06535: variable 'ansible_search_path' from source: unknown 13131 1726867205.06538: variable 'ansible_search_path' from source: unknown 13131 1726867205.06573: calling self._execute() 13131 1726867205.06665: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867205.06670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867205.06683: variable 'omit' from source: magic vars 13131 1726867205.07062: variable 'ansible_distribution_major_version' from source: facts 13131 1726867205.07085: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867205.07089: variable 'omit' from source: magic vars 13131 1726867205.07120: variable 'omit' from source: magic vars 13131 1726867205.07218: variable 'profile' from source: include params 13131 1726867205.07221: variable 'item' from source: include params 13131 1726867205.07268: variable 'item' from source: include params 13131 1726867205.07285: variable 'omit' from source: magic vars 13131 1726867205.07322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867205.07347: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867205.07361: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867205.07381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867205.07389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867205.07414: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 
1726867205.07417: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867205.07420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867205.07489: Set connection var ansible_connection to ssh 13131 1726867205.07493: Set connection var ansible_timeout to 10 13131 1726867205.07495: Set connection var ansible_shell_type to sh 13131 1726867205.07504: Set connection var ansible_shell_executable to /bin/sh 13131 1726867205.07511: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867205.07516: Set connection var ansible_pipelining to False 13131 1726867205.07533: variable 'ansible_shell_executable' from source: unknown 13131 1726867205.07536: variable 'ansible_connection' from source: unknown 13131 1726867205.07539: variable 'ansible_module_compression' from source: unknown 13131 1726867205.07541: variable 'ansible_shell_type' from source: unknown 13131 1726867205.07543: variable 'ansible_shell_executable' from source: unknown 13131 1726867205.07545: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867205.07549: variable 'ansible_pipelining' from source: unknown 13131 1726867205.07551: variable 'ansible_timeout' from source: unknown 13131 1726867205.07556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867205.07704: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867205.07708: variable 'omit' from source: magic vars 13131 1726867205.07713: starting attempt loop 13131 1726867205.07716: running the handler 13131 1726867205.07727: _low_level_execute_command(): starting 13131 1726867205.07734: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867205.08182: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867205.08211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.08215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867205.08217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.08276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867205.08280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867205.08282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867205.08330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867205.09995: stdout chunk (state=3): >>>/root <<< 13131 1726867205.10094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867205.10120: stderr chunk (state=3): >>><<< 13131 1726867205.10123: stdout chunk (state=3): >>><<< 13131 1726867205.10144: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867205.10155: _low_level_execute_command(): starting 13131 1726867205.10159: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867205.101441-14227-274472273186169 `" && echo ansible-tmp-1726867205.101441-14227-274472273186169="` echo /root/.ansible/tmp/ansible-tmp-1726867205.101441-14227-274472273186169 `" ) && sleep 0' 13131 1726867205.10545: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867205.10549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867205.10585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.10595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867205.10598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867205.10600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.10636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867205.10648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867205.10701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867205.12591: stdout chunk (state=3): >>>ansible-tmp-1726867205.101441-14227-274472273186169=/root/.ansible/tmp/ansible-tmp-1726867205.101441-14227-274472273186169 <<< 13131 1726867205.12703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867205.12720: stderr chunk (state=3): >>><<< 13131 1726867205.12723: stdout chunk (state=3): >>><<< 13131 1726867205.12738: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867205.101441-14227-274472273186169=/root/.ansible/tmp/ansible-tmp-1726867205.101441-14227-274472273186169 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867205.12772: variable 'ansible_module_compression' from source: unknown 13131 1726867205.12820: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13131 1726867205.12850: variable 'ansible_facts' from source: unknown 13131 1726867205.12911: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867205.101441-14227-274472273186169/AnsiballZ_stat.py 13131 1726867205.13008: Sending initial data 13131 1726867205.13012: Sent initial data (152 bytes) 13131 1726867205.13435: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867205.13438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867205.13440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
13131 1726867205.13442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867205.13444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.13494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867205.13498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867205.13552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867205.15072: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13131 1726867205.15079: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867205.15115: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867205.15162: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpbfml9z6w /root/.ansible/tmp/ansible-tmp-1726867205.101441-14227-274472273186169/AnsiballZ_stat.py <<< 13131 1726867205.15165: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867205.101441-14227-274472273186169/AnsiballZ_stat.py" <<< 13131 1726867205.15208: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpbfml9z6w" to remote "/root/.ansible/tmp/ansible-tmp-1726867205.101441-14227-274472273186169/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867205.101441-14227-274472273186169/AnsiballZ_stat.py" <<< 13131 1726867205.15747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867205.15780: stderr chunk (state=3): >>><<< 13131 1726867205.15783: stdout chunk (state=3): >>><<< 13131 1726867205.15820: done transferring module to remote 13131 1726867205.15827: _low_level_execute_command(): starting 13131 1726867205.15831: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867205.101441-14227-274472273186169/ /root/.ansible/tmp/ansible-tmp-1726867205.101441-14227-274472273186169/AnsiballZ_stat.py && sleep 0' 13131 1726867205.16246: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867205.16250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.16252: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867205.16254: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867205.16259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.16311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867205.16317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867205.16358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867205.18080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867205.18101: stderr chunk (state=3): >>><<< 13131 1726867205.18104: stdout chunk (state=3): >>><<< 13131 1726867205.18123: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867205.18126: _low_level_execute_command(): starting 13131 1726867205.18129: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867205.101441-14227-274472273186169/AnsiballZ_stat.py && sleep 0' 13131 1726867205.18518: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867205.18521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.18523: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867205.18525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.18573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867205.18576: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867205.18627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867205.33574: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13131 1726867205.35170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867205.35174: stdout chunk (state=3): >>><<< 13131 1726867205.35178: stderr chunk (state=3): >>><<< 13131 1726867205.35183: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867205.35186: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867205.101441-14227-274472273186169/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867205.35188: _low_level_execute_command(): starting 13131 1726867205.35190: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867205.101441-14227-274472273186169/ > /dev/null 2>&1 && sleep 0' 13131 1726867205.35754: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.35790: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.35901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867205.35928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867205.35962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867205.36025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867205.37856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867205.37867: stdout chunk (state=3): >>><<< 13131 1726867205.37897: stderr chunk (state=3): >>><<< 13131 1726867205.38085: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867205.38089: handler run complete 13131 1726867205.38094: attempt loop complete, returning result 13131 1726867205.38096: _execute() done 13131 1726867205.38101: dumping result to json 13131 1726867205.38103: done dumping result, returning 13131 1726867205.38105: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0affcac9-a3a5-5f24-9b7a-000000000675] 13131 1726867205.38107: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000675 13131 1726867205.38181: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000675 13131 1726867205.38184: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 13131 1726867205.38251: no more pending results, returning what we have 13131 1726867205.38255: results queue empty 13131 1726867205.38256: checking for any_errors_fatal 13131 1726867205.38264: done checking for any_errors_fatal 13131 1726867205.38265: checking for max_fail_percentage 13131 1726867205.38267: done checking for max_fail_percentage 13131 1726867205.38267: checking to see if all hosts have failed and the running result is not ok 13131 1726867205.38268: done checking to see if all hosts have failed 13131 1726867205.38269: getting the remaining hosts for this loop 13131 1726867205.38271: done getting the remaining hosts for this loop 13131 1726867205.38275: getting the next task for host managed_node1 13131 1726867205.38286: done getting next task for host managed_node1 13131 1726867205.38289: ^ task is: TASK: Set NM profile exist flag based on the profile files 13131 1726867205.38296: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867205.38300: getting variables 13131 1726867205.38302: in VariableManager get_vars() 13131 1726867205.38361: Calling all_inventory to load vars for managed_node1 13131 1726867205.38364: Calling groups_inventory to load vars for managed_node1 13131 1726867205.38366: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867205.38626: Calling all_plugins_play to load vars for managed_node1 13131 1726867205.38631: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867205.38635: Calling groups_plugins_play to load vars for managed_node1 13131 1726867205.41465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867205.44830: done with get_vars() 13131 1726867205.44858: done getting variables 13131 1726867205.45031: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** 
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:20:05 -0400 (0:00:00.391) 0:00:20.561 ****** 13131 1726867205.45061: entering _queue_task() for managed_node1/set_fact 13131 1726867205.45835: worker is 1 (out of 1 available) 13131 1726867205.45961: exiting _queue_task() for managed_node1/set_fact 13131 1726867205.45974: done queuing things up, now waiting for results queue to drain 13131 1726867205.45976: waiting for pending results... 13131 1726867205.46594: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 13131 1726867205.46599: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000676 13131 1726867205.46602: variable 'ansible_search_path' from source: unknown 13131 1726867205.46604: variable 'ansible_search_path' from source: unknown 13131 1726867205.46784: calling self._execute() 13131 1726867205.46788: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867205.46790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867205.46982: variable 'omit' from source: magic vars 13131 1726867205.47782: variable 'ansible_distribution_major_version' from source: facts 13131 1726867205.47786: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867205.47788: variable 'profile_stat' from source: set_fact 13131 1726867205.48184: Evaluated conditional (profile_stat.stat.exists): False 13131 1726867205.48187: when evaluation is False, skipping this task 13131 1726867205.48190: _execute() done 13131 1726867205.48192: dumping result to json 13131 1726867205.48194: done dumping result, returning 13131 1726867205.48196: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-5f24-9b7a-000000000676] 13131 1726867205.48198: sending task result for task 
0affcac9-a3a5-5f24-9b7a-000000000676 13131 1726867205.48259: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000676 13131 1726867205.48263: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13131 1726867205.48315: no more pending results, returning what we have 13131 1726867205.48319: results queue empty 13131 1726867205.48320: checking for any_errors_fatal 13131 1726867205.48327: done checking for any_errors_fatal 13131 1726867205.48328: checking for max_fail_percentage 13131 1726867205.48330: done checking for max_fail_percentage 13131 1726867205.48331: checking to see if all hosts have failed and the running result is not ok 13131 1726867205.48331: done checking to see if all hosts have failed 13131 1726867205.48332: getting the remaining hosts for this loop 13131 1726867205.48334: done getting the remaining hosts for this loop 13131 1726867205.48337: getting the next task for host managed_node1 13131 1726867205.48345: done getting next task for host managed_node1 13131 1726867205.48347: ^ task is: TASK: Get NM profile info 13131 1726867205.48352: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13131 1726867205.48362: getting variables 13131 1726867205.48364: in VariableManager get_vars() 13131 1726867205.48426: Calling all_inventory to load vars for managed_node1 13131 1726867205.48429: Calling groups_inventory to load vars for managed_node1 13131 1726867205.48432: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867205.48444: Calling all_plugins_play to load vars for managed_node1 13131 1726867205.48448: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867205.48451: Calling groups_plugins_play to load vars for managed_node1 13131 1726867205.51260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867205.53360: done with get_vars() 13131 1726867205.53392: done getting variables 13131 1726867205.53450: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:20:05 -0400 (0:00:00.084) 0:00:20.645 ****** 13131 1726867205.53484: entering _queue_task() for managed_node1/shell 13131 1726867205.53949: worker is 1 (out of 1 available) 13131 1726867205.53960: exiting _queue_task() for managed_node1/shell 13131 1726867205.53970: done queuing things up, now waiting for results queue to drain 13131 1726867205.53971: waiting for pending results... 
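The skip recorded above follows directly from the task's `when:` clause: the earlier "Stat profile file" task returned `"stat": {"exists": false}` for `/etc/sysconfig/network-scripts/ifcfg-bond0.1`, so the conditional `profile_stat.stat.exists` evaluated False and the set_fact task was skipped. A minimal sketch of that gating logic, outside Ansible (the helper name is hypothetical, not an Ansible internal):

```python
# Hypothetical illustration of the `when: profile_stat.stat.exists`
# conditional seen in the log. Ansible registers the stat module's
# result and only runs the dependent task when the file exists.

def should_run_set_fact(profile_stat: dict) -> bool:
    """Mirror the `when: profile_stat.stat.exists` condition."""
    return bool(profile_stat.get("stat", {}).get("exists"))

# Result registered by the "Stat profile file" task in the log:
profile_stat = {"changed": False, "stat": {"exists": False}}

print(should_run_set_fact(profile_stat))  # False -> task is skipped
```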
13131 1726867205.54174: running TaskExecutor() for managed_node1/TASK: Get NM profile info 13131 1726867205.54286: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000677 13131 1726867205.54305: variable 'ansible_search_path' from source: unknown 13131 1726867205.54308: variable 'ansible_search_path' from source: unknown 13131 1726867205.54343: calling self._execute() 13131 1726867205.54449: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867205.54456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867205.54465: variable 'omit' from source: magic vars 13131 1726867205.54880: variable 'ansible_distribution_major_version' from source: facts 13131 1726867205.54902: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867205.54905: variable 'omit' from source: magic vars 13131 1726867205.54959: variable 'omit' from source: magic vars 13131 1726867205.55069: variable 'profile' from source: include params 13131 1726867205.55072: variable 'item' from source: include params 13131 1726867205.55145: variable 'item' from source: include params 13131 1726867205.55176: variable 'omit' from source: magic vars 13131 1726867205.55207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867205.55241: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867205.55269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867205.55288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867205.55302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867205.55331: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 
1726867205.55334: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867205.55336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867205.55582: Set connection var ansible_connection to ssh 13131 1726867205.55586: Set connection var ansible_timeout to 10 13131 1726867205.55588: Set connection var ansible_shell_type to sh 13131 1726867205.55591: Set connection var ansible_shell_executable to /bin/sh 13131 1726867205.55593: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867205.55595: Set connection var ansible_pipelining to False 13131 1726867205.55597: variable 'ansible_shell_executable' from source: unknown 13131 1726867205.55599: variable 'ansible_connection' from source: unknown 13131 1726867205.55601: variable 'ansible_module_compression' from source: unknown 13131 1726867205.55603: variable 'ansible_shell_type' from source: unknown 13131 1726867205.55605: variable 'ansible_shell_executable' from source: unknown 13131 1726867205.55607: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867205.55609: variable 'ansible_pipelining' from source: unknown 13131 1726867205.55612: variable 'ansible_timeout' from source: unknown 13131 1726867205.55614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867205.55681: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867205.55702: variable 'omit' from source: magic vars 13131 1726867205.55707: starting attempt loop 13131 1726867205.55710: running the handler 13131 1726867205.55720: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867205.55740: _low_level_execute_command(): starting 13131 1726867205.55747: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867205.56871: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867205.56893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867205.56912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867205.56933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867205.56981: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.57083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867205.57184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867205.57266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867205.58963: stdout chunk (state=3): >>>/root <<< 13131 1726867205.59284: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867205.59288: stderr chunk (state=3): >>><<< 13131 1726867205.59292: stdout chunk (state=3): >>><<< 13131 1726867205.59295: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867205.59298: _low_level_execute_command(): starting 13131 1726867205.59301: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867205.5913832-14246-190265929553704 `" && echo ansible-tmp-1726867205.5913832-14246-190265929553704="` echo /root/.ansible/tmp/ansible-tmp-1726867205.5913832-14246-190265929553704 `" ) && sleep 0' 13131 1726867205.60083: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867205.60086: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867205.60089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867205.60092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867205.60094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867205.60096: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867205.60099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.60101: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867205.60103: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867205.60105: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867205.60107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867205.60109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867205.60111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867205.60113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867205.60115: stderr chunk (state=3): >>>debug2: match found <<< 13131 1726867205.60118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.60191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867205.60201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867205.60210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 
1726867205.60279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867205.62160: stdout chunk (state=3): >>>ansible-tmp-1726867205.5913832-14246-190265929553704=/root/.ansible/tmp/ansible-tmp-1726867205.5913832-14246-190265929553704 <<< 13131 1726867205.62258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867205.62291: stderr chunk (state=3): >>><<< 13131 1726867205.62294: stdout chunk (state=3): >>><<< 13131 1726867205.62315: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867205.5913832-14246-190265929553704=/root/.ansible/tmp/ansible-tmp-1726867205.5913832-14246-190265929553704 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867205.62345: variable 'ansible_module_compression' from source: unknown 13131 1726867205.62395: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13131 1726867205.62432: variable 'ansible_facts' from source: unknown 13131 1726867205.62520: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867205.5913832-14246-190265929553704/AnsiballZ_command.py 13131 1726867205.62730: Sending initial data 13131 1726867205.62734: Sent initial data (156 bytes) 13131 1726867205.63833: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867205.63995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867205.64065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867205.65600: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867205.65653: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13131 1726867205.65716: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpmh3w3jvm /root/.ansible/tmp/ansible-tmp-1726867205.5913832-14246-190265929553704/AnsiballZ_command.py <<< 13131 1726867205.65720: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867205.5913832-14246-190265929553704/AnsiballZ_command.py" <<< 13131 1726867205.65781: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpmh3w3jvm" to remote "/root/.ansible/tmp/ansible-tmp-1726867205.5913832-14246-190265929553704/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867205.5913832-14246-190265929553704/AnsiballZ_command.py" <<< 13131 1726867205.67030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867205.67034: stdout chunk (state=3): >>><<< 13131 1726867205.67041: stderr chunk (state=3): >>><<< 13131 1726867205.67221: done transferring module to remote 13131 1726867205.67314: _low_level_execute_command(): starting 13131 1726867205.67317: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867205.5913832-14246-190265929553704/ 
/root/.ansible/tmp/ansible-tmp-1726867205.5913832-14246-190265929553704/AnsiballZ_command.py && sleep 0' 13131 1726867205.68208: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.68324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867205.68339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867205.68410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867205.70231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867205.70241: stdout chunk (state=3): >>><<< 13131 1726867205.70252: stderr chunk (state=3): >>><<< 13131 1726867205.70270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867205.70282: _low_level_execute_command(): starting 13131 1726867205.70294: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867205.5913832-14246-190265929553704/AnsiballZ_command.py && sleep 0' 13131 1726867205.71082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867205.71085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867205.71088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13131 1726867205.71090: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867205.71092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.71179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867205.71183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867205.71251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867205.88224: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 17:20:05.859370", "end": "2024-09-20 17:20:05.880510", "delta": "0:00:00.021140", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13131 1726867205.89729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867205.89754: stderr chunk (state=3): >>><<< 13131 1726867205.89757: stdout chunk (state=3): >>><<< 13131 1726867205.89776: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 17:20:05.859370", "end": "2024-09-20 17:20:05.880510", "delta": "0:00:00.021140", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.12.57 closed. 13131 1726867205.89809: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867205.5913832-14246-190265929553704/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867205.89818: _low_level_execute_command(): starting 13131 1726867205.89820: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867205.5913832-14246-190265929553704/ > /dev/null 2>&1 && sleep 0' 13131 1726867205.90259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867205.90263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.90265: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 13131 1726867205.90267: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867205.90269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867205.90324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867205.90331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867205.90333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867205.90373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867205.92282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867205.92286: stdout chunk (state=3): >>><<< 13131 1726867205.92288: stderr chunk (state=3): >>><<< 13131 1726867205.92290: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867205.92295: handler run complete 13131 1726867205.92298: Evaluated conditional (False): False 13131 1726867205.92300: attempt loop complete, returning result 13131 1726867205.92302: _execute() done 13131 1726867205.92304: dumping result to json 13131 1726867205.92305: done dumping result, returning 13131 1726867205.92307: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0affcac9-a3a5-5f24-9b7a-000000000677] 13131 1726867205.92309: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000677 13131 1726867205.92371: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000677 13131 1726867205.92374: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.021140", "end": "2024-09-20 17:20:05.880510", "rc": 0, "start": "2024-09-20 17:20:05.859370" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 13131 1726867205.92458: no more pending results, returning what we have 13131 1726867205.92461: results queue empty 13131 1726867205.92463: checking for any_errors_fatal 13131 1726867205.92469: done checking for any_errors_fatal 13131 1726867205.92470: checking for max_fail_percentage 13131 1726867205.92472: done checking for max_fail_percentage 13131 1726867205.92473: checking to see if all hosts have failed and the running result is not ok 13131 1726867205.92474: done checking to see if all hosts have failed 13131 1726867205.92474: getting the remaining hosts for this loop 13131 1726867205.92476: done getting the remaining hosts for this loop 13131 1726867205.92481: getting the next task for host managed_node1 13131 1726867205.92490: done getting next task for host 
managed_node1 13131 1726867205.92493: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13131 1726867205.92498: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867205.92501: getting variables 13131 1726867205.92503: in VariableManager get_vars() 13131 1726867205.92559: Calling all_inventory to load vars for managed_node1 13131 1726867205.92562: Calling groups_inventory to load vars for managed_node1 13131 1726867205.92564: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867205.92575: Calling all_plugins_play to load vars for managed_node1 13131 1726867205.92691: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867205.92697: Calling groups_plugins_play to load vars for managed_node1 13131 1726867205.94326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867205.95889: done with get_vars() 13131 1726867205.95910: done getting variables 13131 1726867205.95979: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:20:05 -0400 (0:00:00.425) 0:00:21.070 ****** 13131 1726867205.96009: entering _queue_task() for managed_node1/set_fact 13131 1726867205.96345: worker is 1 (out of 1 available) 13131 1726867205.96357: exiting _queue_task() for managed_node1/set_fact 13131 1726867205.96370: done queuing things up, now waiting for results queue to drain 13131 1726867205.96371: waiting for pending results... 
13131 1726867205.96897: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13131 1726867205.96902: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000678 13131 1726867205.96905: variable 'ansible_search_path' from source: unknown 13131 1726867205.96907: variable 'ansible_search_path' from source: unknown 13131 1726867205.97083: calling self._execute() 13131 1726867205.97087: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867205.97090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867205.97096: variable 'omit' from source: magic vars 13131 1726867205.97299: variable 'ansible_distribution_major_version' from source: facts 13131 1726867205.97308: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867205.97444: variable 'nm_profile_exists' from source: set_fact 13131 1726867205.97466: Evaluated conditional (nm_profile_exists.rc == 0): True 13131 1726867205.97473: variable 'omit' from source: magic vars 13131 1726867205.97517: variable 'omit' from source: magic vars 13131 1726867205.97547: variable 'omit' from source: magic vars 13131 1726867205.97596: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867205.97628: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867205.97648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867205.97665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867205.97685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867205.97713: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
13131 1726867205.97717: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867205.97720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867205.97821: Set connection var ansible_connection to ssh 13131 1726867205.97829: Set connection var ansible_timeout to 10 13131 1726867205.97832: Set connection var ansible_shell_type to sh 13131 1726867205.97840: Set connection var ansible_shell_executable to /bin/sh 13131 1726867205.97849: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867205.97855: Set connection var ansible_pipelining to False 13131 1726867205.97878: variable 'ansible_shell_executable' from source: unknown 13131 1726867205.97881: variable 'ansible_connection' from source: unknown 13131 1726867205.97884: variable 'ansible_module_compression' from source: unknown 13131 1726867205.97886: variable 'ansible_shell_type' from source: unknown 13131 1726867205.97897: variable 'ansible_shell_executable' from source: unknown 13131 1726867205.97899: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867205.97902: variable 'ansible_pipelining' from source: unknown 13131 1726867205.97904: variable 'ansible_timeout' from source: unknown 13131 1726867205.97909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867205.98048: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867205.98058: variable 'omit' from source: magic vars 13131 1726867205.98064: starting attempt loop 13131 1726867205.98068: running the handler 13131 1726867205.98080: handler run complete 13131 1726867205.98090: attempt loop complete, returning result 13131 1726867205.98095: _execute() done 
13131 1726867205.98098: dumping result to json 13131 1726867205.98100: done dumping result, returning 13131 1726867205.98113: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-5f24-9b7a-000000000678] 13131 1726867205.98117: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000678 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 13131 1726867205.98385: no more pending results, returning what we have 13131 1726867205.98388: results queue empty 13131 1726867205.98388: checking for any_errors_fatal 13131 1726867205.98396: done checking for any_errors_fatal 13131 1726867205.98397: checking for max_fail_percentage 13131 1726867205.98398: done checking for max_fail_percentage 13131 1726867205.98399: checking to see if all hosts have failed and the running result is not ok 13131 1726867205.98400: done checking to see if all hosts have failed 13131 1726867205.98400: getting the remaining hosts for this loop 13131 1726867205.98402: done getting the remaining hosts for this loop 13131 1726867205.98405: getting the next task for host managed_node1 13131 1726867205.98413: done getting next task for host managed_node1 13131 1726867205.98415: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 13131 1726867205.98419: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867205.98422: getting variables 13131 1726867205.98424: in VariableManager get_vars() 13131 1726867205.98473: Calling all_inventory to load vars for managed_node1 13131 1726867205.98476: Calling groups_inventory to load vars for managed_node1 13131 1726867205.98481: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867205.98491: Calling all_plugins_play to load vars for managed_node1 13131 1726867205.98494: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867205.98497: Calling groups_plugins_play to load vars for managed_node1 13131 1726867205.99018: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000678 13131 1726867205.99022: WORKER PROCESS EXITING 13131 1726867205.99875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867206.01489: done with get_vars() 13131 1726867206.01513: done getting variables 13131 1726867206.01571: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867206.01703: variable 'profile' from source: include params 13131 1726867206.01707: variable 'item' from source: include params 13131 1726867206.01784: variable 'item' from source: include params TASK [Get the 
ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:20:06 -0400 (0:00:00.058) 0:00:21.128 ****** 13131 1726867206.01829: entering _queue_task() for managed_node1/command 13131 1726867206.02491: worker is 1 (out of 1 available) 13131 1726867206.02501: exiting _queue_task() for managed_node1/command 13131 1726867206.02510: done queuing things up, now waiting for results queue to drain 13131 1726867206.02511: waiting for pending results... 13131 1726867206.02639: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 13131 1726867206.02688: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000067a 13131 1726867206.02713: variable 'ansible_search_path' from source: unknown 13131 1726867206.02724: variable 'ansible_search_path' from source: unknown 13131 1726867206.02767: calling self._execute() 13131 1726867206.02869: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.02884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.02900: variable 'omit' from source: magic vars 13131 1726867206.03265: variable 'ansible_distribution_major_version' from source: facts 13131 1726867206.03288: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867206.03412: variable 'profile_stat' from source: set_fact 13131 1726867206.03482: Evaluated conditional (profile_stat.stat.exists): False 13131 1726867206.03485: when evaluation is False, skipping this task 13131 1726867206.03495: _execute() done 13131 1726867206.03497: dumping result to json 13131 1726867206.03500: done dumping result, returning 13131 1726867206.03502: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [0affcac9-a3a5-5f24-9b7a-00000000067a] 13131 
1726867206.03504: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000067a skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13131 1726867206.03649: no more pending results, returning what we have 13131 1726867206.03653: results queue empty 13131 1726867206.03655: checking for any_errors_fatal 13131 1726867206.03659: done checking for any_errors_fatal 13131 1726867206.03660: checking for max_fail_percentage 13131 1726867206.03662: done checking for max_fail_percentage 13131 1726867206.03663: checking to see if all hosts have failed and the running result is not ok 13131 1726867206.03663: done checking to see if all hosts have failed 13131 1726867206.03664: getting the remaining hosts for this loop 13131 1726867206.03666: done getting the remaining hosts for this loop 13131 1726867206.03670: getting the next task for host managed_node1 13131 1726867206.03680: done getting next task for host managed_node1 13131 1726867206.03683: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13131 1726867206.03688: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867206.03695: getting variables 13131 1726867206.03697: in VariableManager get_vars() 13131 1726867206.03751: Calling all_inventory to load vars for managed_node1 13131 1726867206.03754: Calling groups_inventory to load vars for managed_node1 13131 1726867206.03756: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867206.03769: Calling all_plugins_play to load vars for managed_node1 13131 1726867206.03771: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867206.03774: Calling groups_plugins_play to load vars for managed_node1 13131 1726867206.03988: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000067a 13131 1726867206.03992: WORKER PROCESS EXITING 13131 1726867206.09254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867206.10808: done with get_vars() 13131 1726867206.10827: done getting variables 13131 1726867206.10872: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867206.10963: variable 'profile' from source: include params 13131 1726867206.10966: variable 'item' from source: include params 13131 1726867206.11024: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:20:06 -0400 (0:00:00.092) 0:00:21.221 ****** 13131 1726867206.11049: entering _queue_task() for managed_node1/set_fact 13131 1726867206.11383: worker is 1 (out of 1 available) 13131 1726867206.11397: exiting _queue_task() for managed_node1/set_fact 13131 
1726867206.11408: done queuing things up, now waiting for results queue to drain 13131 1726867206.11409: waiting for pending results... 13131 1726867206.11689: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 13131 1726867206.11827: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000067b 13131 1726867206.11849: variable 'ansible_search_path' from source: unknown 13131 1726867206.11856: variable 'ansible_search_path' from source: unknown 13131 1726867206.11898: calling self._execute() 13131 1726867206.12006: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.12020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.12035: variable 'omit' from source: magic vars 13131 1726867206.12408: variable 'ansible_distribution_major_version' from source: facts 13131 1726867206.12425: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867206.12555: variable 'profile_stat' from source: set_fact 13131 1726867206.12580: Evaluated conditional (profile_stat.stat.exists): False 13131 1726867206.12589: when evaluation is False, skipping this task 13131 1726867206.12599: _execute() done 13131 1726867206.12606: dumping result to json 13131 1726867206.12612: done dumping result, returning 13131 1726867206.12622: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [0affcac9-a3a5-5f24-9b7a-00000000067b] 13131 1726867206.12631: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000067b skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13131 1726867206.12785: no more pending results, returning what we have 13131 1726867206.12788: results queue empty 13131 1726867206.12789: checking for any_errors_fatal 13131 1726867206.12800: done checking for any_errors_fatal 13131 1726867206.12801: 
checking for max_fail_percentage 13131 1726867206.12803: done checking for max_fail_percentage 13131 1726867206.12803: checking to see if all hosts have failed and the running result is not ok 13131 1726867206.12804: done checking to see if all hosts have failed 13131 1726867206.12805: getting the remaining hosts for this loop 13131 1726867206.12806: done getting the remaining hosts for this loop 13131 1726867206.12810: getting the next task for host managed_node1 13131 1726867206.12817: done getting next task for host managed_node1 13131 1726867206.12820: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13131 1726867206.12824: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867206.12828: getting variables 13131 1726867206.12830: in VariableManager get_vars() 13131 1726867206.12886: Calling all_inventory to load vars for managed_node1 13131 1726867206.12889: Calling groups_inventory to load vars for managed_node1 13131 1726867206.12892: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867206.12906: Calling all_plugins_play to load vars for managed_node1 13131 1726867206.12909: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867206.12912: Calling groups_plugins_play to load vars for managed_node1 13131 1726867206.13791: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000067b 13131 1726867206.13798: WORKER PROCESS EXITING 13131 1726867206.14489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867206.16168: done with get_vars() 13131 1726867206.16191: done getting variables 13131 1726867206.16250: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867206.16367: variable 'profile' from source: include params 13131 1726867206.16370: variable 'item' from source: include params 13131 1726867206.16432: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:20:06 -0400 (0:00:00.054) 0:00:21.275 ****** 13131 1726867206.16464: entering _queue_task() for managed_node1/command 13131 1726867206.16801: worker is 1 (out of 1 available) 13131 1726867206.16813: exiting _queue_task() for managed_node1/command 13131 
1726867206.16826: done queuing things up, now waiting for results queue to drain 13131 1726867206.16827: waiting for pending results... 13131 1726867206.17108: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 13131 1726867206.17243: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000067c 13131 1726867206.17262: variable 'ansible_search_path' from source: unknown 13131 1726867206.17269: variable 'ansible_search_path' from source: unknown 13131 1726867206.17317: calling self._execute() 13131 1726867206.17422: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.17433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.17447: variable 'omit' from source: magic vars 13131 1726867206.17820: variable 'ansible_distribution_major_version' from source: facts 13131 1726867206.17837: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867206.17969: variable 'profile_stat' from source: set_fact 13131 1726867206.17992: Evaluated conditional (profile_stat.stat.exists): False 13131 1726867206.18003: when evaluation is False, skipping this task 13131 1726867206.18064: _execute() done 13131 1726867206.18067: dumping result to json 13131 1726867206.18070: done dumping result, returning 13131 1726867206.18072: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 [0affcac9-a3a5-5f24-9b7a-00000000067c] 13131 1726867206.18075: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000067c 13131 1726867206.18145: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000067c 13131 1726867206.18148: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13131 1726867206.18221: no more pending results, returning what we have 13131 1726867206.18225: results queue empty 13131 
1726867206.18226: checking for any_errors_fatal 13131 1726867206.18233: done checking for any_errors_fatal 13131 1726867206.18233: checking for max_fail_percentage 13131 1726867206.18235: done checking for max_fail_percentage 13131 1726867206.18236: checking to see if all hosts have failed and the running result is not ok 13131 1726867206.18237: done checking to see if all hosts have failed 13131 1726867206.18237: getting the remaining hosts for this loop 13131 1726867206.18239: done getting the remaining hosts for this loop 13131 1726867206.18242: getting the next task for host managed_node1 13131 1726867206.18250: done getting next task for host managed_node1 13131 1726867206.18252: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13131 1726867206.18257: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867206.18261: getting variables 13131 1726867206.18262: in VariableManager get_vars() 13131 1726867206.18319: Calling all_inventory to load vars for managed_node1 13131 1726867206.18322: Calling groups_inventory to load vars for managed_node1 13131 1726867206.18325: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867206.18339: Calling all_plugins_play to load vars for managed_node1 13131 1726867206.18342: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867206.18345: Calling groups_plugins_play to load vars for managed_node1 13131 1726867206.19868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867206.21532: done with get_vars() 13131 1726867206.21559: done getting variables 13131 1726867206.21620: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867206.21772: variable 'profile' from source: include params 13131 1726867206.21776: variable 'item' from source: include params 13131 1726867206.21833: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:20:06 -0400 (0:00:00.053) 0:00:21.329 ****** 13131 1726867206.21864: entering _queue_task() for managed_node1/set_fact 13131 1726867206.22187: worker is 1 (out of 1 available) 13131 1726867206.22199: exiting _queue_task() for managed_node1/set_fact 13131 1726867206.22210: done queuing things up, now waiting for results queue to drain 13131 1726867206.22212: waiting for pending results... 
13131 1726867206.22599: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 13131 1726867206.22611: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000067d 13131 1726867206.22630: variable 'ansible_search_path' from source: unknown 13131 1726867206.22637: variable 'ansible_search_path' from source: unknown 13131 1726867206.22676: calling self._execute() 13131 1726867206.22784: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.22798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.22816: variable 'omit' from source: magic vars 13131 1726867206.23185: variable 'ansible_distribution_major_version' from source: facts 13131 1726867206.23202: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867206.23442: variable 'profile_stat' from source: set_fact 13131 1726867206.23464: Evaluated conditional (profile_stat.stat.exists): False 13131 1726867206.23471: when evaluation is False, skipping this task 13131 1726867206.23480: _execute() done 13131 1726867206.23487: dumping result to json 13131 1726867206.23495: done dumping result, returning 13131 1726867206.23505: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [0affcac9-a3a5-5f24-9b7a-00000000067d] 13131 1726867206.23513: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000067d skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13131 1726867206.23781: no more pending results, returning what we have 13131 1726867206.23786: results queue empty 13131 1726867206.23787: checking for any_errors_fatal 13131 1726867206.23795: done checking for any_errors_fatal 13131 1726867206.23795: checking for max_fail_percentage 13131 1726867206.23798: done checking for max_fail_percentage 13131 1726867206.23798: checking to see if all hosts 
have failed and the running result is not ok 13131 1726867206.23799: done checking to see if all hosts have failed 13131 1726867206.23800: getting the remaining hosts for this loop 13131 1726867206.23801: done getting the remaining hosts for this loop 13131 1726867206.23805: getting the next task for host managed_node1 13131 1726867206.23814: done getting next task for host managed_node1 13131 1726867206.23816: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13131 1726867206.23820: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867206.23825: getting variables 13131 1726867206.23827: in VariableManager get_vars() 13131 1726867206.23884: Calling all_inventory to load vars for managed_node1 13131 1726867206.23887: Calling groups_inventory to load vars for managed_node1 13131 1726867206.23889: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867206.23902: Calling all_plugins_play to load vars for managed_node1 13131 1726867206.23905: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867206.23908: Calling groups_plugins_play to load vars for managed_node1 13131 1726867206.24490: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000067d 13131 1726867206.24494: WORKER PROCESS EXITING 13131 1726867206.25553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867206.28041: done with get_vars() 13131 1726867206.28062: done getting variables 13131 1726867206.28121: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867206.28436: variable 'profile' from source: include params 13131 1726867206.28440: variable 'item' from source: include params 13131 1726867206.28497: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 17:20:06 -0400 (0:00:00.066) 0:00:21.395 ****** 13131 1726867206.28526: entering _queue_task() for managed_node1/assert 13131 1726867206.29074: worker is 1 (out of 1 available) 13131 1726867206.29098: exiting _queue_task() for managed_node1/assert 13131 
1726867206.29112: done queuing things up, now waiting for results queue to drain 13131 1726867206.29113: waiting for pending results... 13131 1726867206.29349: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.1' 13131 1726867206.29470: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000364 13131 1726867206.29494: variable 'ansible_search_path' from source: unknown 13131 1726867206.29503: variable 'ansible_search_path' from source: unknown 13131 1726867206.29544: calling self._execute() 13131 1726867206.29652: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.29665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.29683: variable 'omit' from source: magic vars 13131 1726867206.30040: variable 'ansible_distribution_major_version' from source: facts 13131 1726867206.30057: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867206.30067: variable 'omit' from source: magic vars 13131 1726867206.30112: variable 'omit' from source: magic vars 13131 1726867206.30208: variable 'profile' from source: include params 13131 1726867206.30218: variable 'item' from source: include params 13131 1726867206.30280: variable 'item' from source: include params 13131 1726867206.30302: variable 'omit' from source: magic vars 13131 1726867206.30345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867206.30382: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867206.30404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867206.30424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867206.30437: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867206.30470: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867206.30479: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.30487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.30570: Set connection var ansible_connection to ssh 13131 1726867206.30584: Set connection var ansible_timeout to 10 13131 1726867206.30590: Set connection var ansible_shell_type to sh 13131 1726867206.30782: Set connection var ansible_shell_executable to /bin/sh 13131 1726867206.30786: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867206.30792: Set connection var ansible_pipelining to False 13131 1726867206.30795: variable 'ansible_shell_executable' from source: unknown 13131 1726867206.30797: variable 'ansible_connection' from source: unknown 13131 1726867206.30799: variable 'ansible_module_compression' from source: unknown 13131 1726867206.30801: variable 'ansible_shell_type' from source: unknown 13131 1726867206.30803: variable 'ansible_shell_executable' from source: unknown 13131 1726867206.30805: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.30807: variable 'ansible_pipelining' from source: unknown 13131 1726867206.30810: variable 'ansible_timeout' from source: unknown 13131 1726867206.30812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.30844: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867206.30860: variable 'omit' from source: magic vars 13131 1726867206.30870: starting 
attempt loop 13131 1726867206.30878: running the handler 13131 1726867206.31002: variable 'lsr_net_profile_exists' from source: set_fact 13131 1726867206.31012: Evaluated conditional (lsr_net_profile_exists): True 13131 1726867206.31022: handler run complete 13131 1726867206.31041: attempt loop complete, returning result 13131 1726867206.31048: _execute() done 13131 1726867206.31055: dumping result to json 13131 1726867206.31061: done dumping result, returning 13131 1726867206.31071: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.1' [0affcac9-a3a5-5f24-9b7a-000000000364] 13131 1726867206.31081: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000364 13131 1726867206.31283: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000364 13131 1726867206.31287: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 13131 1726867206.31334: no more pending results, returning what we have 13131 1726867206.31337: results queue empty 13131 1726867206.31338: checking for any_errors_fatal 13131 1726867206.31345: done checking for any_errors_fatal 13131 1726867206.31345: checking for max_fail_percentage 13131 1726867206.31347: done checking for max_fail_percentage 13131 1726867206.31347: checking to see if all hosts have failed and the running result is not ok 13131 1726867206.31348: done checking to see if all hosts have failed 13131 1726867206.31349: getting the remaining hosts for this loop 13131 1726867206.31350: done getting the remaining hosts for this loop 13131 1726867206.31353: getting the next task for host managed_node1 13131 1726867206.31358: done getting next task for host managed_node1 13131 1726867206.31360: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13131 1726867206.31363: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867206.31366: getting variables 13131 1726867206.31368: in VariableManager get_vars() 13131 1726867206.31420: Calling all_inventory to load vars for managed_node1 13131 1726867206.31423: Calling groups_inventory to load vars for managed_node1 13131 1726867206.31425: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867206.31434: Calling all_plugins_play to load vars for managed_node1 13131 1726867206.31436: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867206.31438: Calling groups_plugins_play to load vars for managed_node1 13131 1726867206.34170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867206.37321: done with get_vars() 13131 1726867206.37343: done getting variables 13131 1726867206.37399: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867206.37718: variable 'profile' from source: include params 13131 1726867206.37722: variable 'item' from source: include params 13131 1726867206.37781: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 17:20:06 -0400 (0:00:00.092) 0:00:21.488 ****** 13131 1726867206.37815: entering _queue_task() for managed_node1/assert 13131 1726867206.38510: worker is 1 (out of 1 available) 13131 1726867206.38520: exiting _queue_task() for managed_node1/assert 13131 1726867206.38530: done queuing things up, now waiting for results queue to drain 13131 1726867206.38531: waiting for pending results... 13131 1726867206.38895: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' 13131 1726867206.38901: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000365 13131 1726867206.38904: variable 'ansible_search_path' from source: unknown 13131 1726867206.38908: variable 'ansible_search_path' from source: unknown 13131 1726867206.38927: calling self._execute() 13131 1726867206.39038: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.39045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.39054: variable 'omit' from source: magic vars 13131 1726867206.39479: variable 'ansible_distribution_major_version' from source: facts 13131 1726867206.39490: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867206.39681: variable 'omit' from source: magic vars 13131 1726867206.39684: variable 'omit' from source: magic vars 13131 1726867206.39687: variable 'profile' from source: include params 13131 1726867206.39689: variable 'item' from source: include params 13131 1726867206.39744: variable 'item' from source: include params 13131 1726867206.39771: variable 'omit' from source: magic vars 13131 1726867206.39817: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867206.39850: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867206.39875: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867206.39895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867206.39909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867206.39947: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867206.39951: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.39953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.40155: Set connection var ansible_connection to ssh 13131 1726867206.40159: Set connection var ansible_timeout to 10 13131 1726867206.40161: Set connection var ansible_shell_type to sh 13131 1726867206.40163: Set connection var ansible_shell_executable to /bin/sh 13131 1726867206.40165: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867206.40168: Set connection var ansible_pipelining to False 13131 1726867206.40170: variable 'ansible_shell_executable' from source: unknown 13131 1726867206.40171: variable 'ansible_connection' from source: unknown 13131 1726867206.40173: variable 'ansible_module_compression' from source: unknown 13131 1726867206.40175: variable 'ansible_shell_type' from source: unknown 13131 1726867206.40179: variable 'ansible_shell_executable' from source: unknown 13131 1726867206.40181: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.40183: variable 'ansible_pipelining' from source: unknown 13131 1726867206.40186: variable 'ansible_timeout' from source: unknown 13131 1726867206.40188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 
1726867206.40315: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867206.40325: variable 'omit' from source: magic vars 13131 1726867206.40330: starting attempt loop 13131 1726867206.40333: running the handler 13131 1726867206.40582: variable 'lsr_net_profile_ansible_managed' from source: set_fact 13131 1726867206.40587: Evaluated conditional (lsr_net_profile_ansible_managed): True 13131 1726867206.40589: handler run complete 13131 1726867206.40591: attempt loop complete, returning result 13131 1726867206.40593: _execute() done 13131 1726867206.40595: dumping result to json 13131 1726867206.40597: done dumping result, returning 13131 1726867206.40599: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' [0affcac9-a3a5-5f24-9b7a-000000000365] 13131 1726867206.40601: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000365 13131 1726867206.40661: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000365 13131 1726867206.40663: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 13131 1726867206.40710: no more pending results, returning what we have 13131 1726867206.40712: results queue empty 13131 1726867206.40714: checking for any_errors_fatal 13131 1726867206.40719: done checking for any_errors_fatal 13131 1726867206.40720: checking for max_fail_percentage 13131 1726867206.40721: done checking for max_fail_percentage 13131 1726867206.40722: checking to see if all hosts have failed and the running result is not ok 13131 1726867206.40723: done checking to see if all hosts have failed 13131 1726867206.40723: getting the remaining hosts for this loop 13131 1726867206.40725: done 
getting the remaining hosts for this loop 13131 1726867206.40728: getting the next task for host managed_node1 13131 1726867206.40732: done getting next task for host managed_node1 13131 1726867206.40735: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 13131 1726867206.40738: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867206.40741: getting variables 13131 1726867206.40742: in VariableManager get_vars() 13131 1726867206.40786: Calling all_inventory to load vars for managed_node1 13131 1726867206.40788: Calling groups_inventory to load vars for managed_node1 13131 1726867206.40790: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867206.40799: Calling all_plugins_play to load vars for managed_node1 13131 1726867206.40801: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867206.40803: Calling groups_plugins_play to load vars for managed_node1 13131 1726867206.43287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867206.44832: done with get_vars() 13131 1726867206.44854: done getting variables 13131 1726867206.44910: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) 13131 1726867206.45017: variable 'profile' from source: include params 13131 1726867206.45021: variable 'item' from source: include params 13131 1726867206.45075: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 17:20:06 -0400 (0:00:00.072) 0:00:21.561 ****** 13131 1726867206.45113: entering _queue_task() for managed_node1/assert 13131 1726867206.45398: worker is 1 (out of 1 available) 13131 1726867206.45412: exiting _queue_task() for managed_node1/assert 13131 1726867206.45423: done queuing things up, now waiting for results queue to drain 13131 1726867206.45424: waiting for pending results... 13131 1726867206.45803: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.1 13131 1726867206.45807: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000366 13131 1726867206.45814: variable 'ansible_search_path' from source: unknown 13131 1726867206.45883: variable 'ansible_search_path' from source: unknown 13131 1726867206.45887: calling self._execute() 13131 1726867206.45965: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.45976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.45992: variable 'omit' from source: magic vars 13131 1726867206.46356: variable 'ansible_distribution_major_version' from source: facts 13131 1726867206.46372: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867206.46385: variable 'omit' from source: magic vars 13131 1726867206.46423: variable 'omit' from source: magic vars 13131 1726867206.46526: variable 'profile' from source: include params 13131 1726867206.46536: variable 'item' from source: include params 13131 
1726867206.46605: variable 'item' from source: include params 13131 1726867206.46629: variable 'omit' from source: magic vars 13131 1726867206.46882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867206.46886: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867206.46888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867206.46891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867206.46893: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867206.46895: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867206.46897: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.46899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.46901: Set connection var ansible_connection to ssh 13131 1726867206.46909: Set connection var ansible_timeout to 10 13131 1726867206.46915: Set connection var ansible_shell_type to sh 13131 1726867206.46927: Set connection var ansible_shell_executable to /bin/sh 13131 1726867206.46940: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867206.46949: Set connection var ansible_pipelining to False 13131 1726867206.46972: variable 'ansible_shell_executable' from source: unknown 13131 1726867206.46982: variable 'ansible_connection' from source: unknown 13131 1726867206.46989: variable 'ansible_module_compression' from source: unknown 13131 1726867206.46995: variable 'ansible_shell_type' from source: unknown 13131 1726867206.47002: variable 'ansible_shell_executable' from source: unknown 13131 1726867206.47008: variable 'ansible_host' from source: host 
vars for 'managed_node1' 13131 1726867206.47021: variable 'ansible_pipelining' from source: unknown 13131 1726867206.47028: variable 'ansible_timeout' from source: unknown 13131 1726867206.47035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.47171: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867206.47191: variable 'omit' from source: magic vars 13131 1726867206.47201: starting attempt loop 13131 1726867206.47208: running the handler 13131 1726867206.47319: variable 'lsr_net_profile_fingerprint' from source: set_fact 13131 1726867206.47329: Evaluated conditional (lsr_net_profile_fingerprint): True 13131 1726867206.47342: handler run complete 13131 1726867206.47359: attempt loop complete, returning result 13131 1726867206.47366: _execute() done 13131 1726867206.47374: dumping result to json 13131 1726867206.47383: done dumping result, returning 13131 1726867206.47393: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.1 [0affcac9-a3a5-5f24-9b7a-000000000366] 13131 1726867206.47401: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000366 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 13131 1726867206.47666: no more pending results, returning what we have 13131 1726867206.47669: results queue empty 13131 1726867206.47670: checking for any_errors_fatal 13131 1726867206.47678: done checking for any_errors_fatal 13131 1726867206.47679: checking for max_fail_percentage 13131 1726867206.47681: done checking for max_fail_percentage 13131 1726867206.47681: checking to see if all hosts have failed and the running result is not ok 13131 1726867206.47682: done checking to see 
if all hosts have failed 13131 1726867206.47683: getting the remaining hosts for this loop 13131 1726867206.47685: done getting the remaining hosts for this loop 13131 1726867206.47688: getting the next task for host managed_node1 13131 1726867206.47695: done getting next task for host managed_node1 13131 1726867206.47697: ^ task is: TASK: ** TEST check polling interval 13131 1726867206.47700: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867206.47704: getting variables 13131 1726867206.47705: in VariableManager get_vars() 13131 1726867206.47754: Calling all_inventory to load vars for managed_node1 13131 1726867206.47756: Calling groups_inventory to load vars for managed_node1 13131 1726867206.47759: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867206.47768: Calling all_plugins_play to load vars for managed_node1 13131 1726867206.47771: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867206.47774: Calling groups_plugins_play to load vars for managed_node1 13131 1726867206.47787: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000366 13131 1726867206.47790: WORKER PROCESS EXITING 13131 1726867206.49142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867206.51023: done with get_vars() 13131 1726867206.51053: done getting variables 13131 1726867206.51118: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check 
polling interval] ****************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:75 Friday 20 September 2024 17:20:06 -0400 (0:00:00.060) 0:00:21.622 ****** 13131 1726867206.51155: entering _queue_task() for managed_node1/command 13131 1726867206.51611: worker is 1 (out of 1 available) 13131 1726867206.51623: exiting _queue_task() for managed_node1/command 13131 1726867206.51632: done queuing things up, now waiting for results queue to drain 13131 1726867206.51646: waiting for pending results... 13131 1726867206.51851: running TaskExecutor() for managed_node1/TASK: ** TEST check polling interval 13131 1726867206.52208: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000071 13131 1726867206.52213: variable 'ansible_search_path' from source: unknown 13131 1726867206.52216: calling self._execute() 13131 1726867206.52219: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.52222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.52225: variable 'omit' from source: magic vars 13131 1726867206.52686: variable 'ansible_distribution_major_version' from source: facts 13131 1726867206.52689: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867206.52692: variable 'omit' from source: magic vars 13131 1726867206.52694: variable 'omit' from source: magic vars 13131 1726867206.52793: variable 'controller_device' from source: play vars 13131 1726867206.52796: variable 'omit' from source: magic vars 13131 1726867206.52799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867206.52982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867206.52985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 
1726867206.52988: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867206.52991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867206.52993: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867206.52996: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.52998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.53024: Set connection var ansible_connection to ssh 13131 1726867206.53031: Set connection var ansible_timeout to 10 13131 1726867206.53034: Set connection var ansible_shell_type to sh 13131 1726867206.53041: Set connection var ansible_shell_executable to /bin/sh 13131 1726867206.53050: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867206.53062: Set connection var ansible_pipelining to False 13131 1726867206.53083: variable 'ansible_shell_executable' from source: unknown 13131 1726867206.53086: variable 'ansible_connection' from source: unknown 13131 1726867206.53089: variable 'ansible_module_compression' from source: unknown 13131 1726867206.53092: variable 'ansible_shell_type' from source: unknown 13131 1726867206.53094: variable 'ansible_shell_executable' from source: unknown 13131 1726867206.53097: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.53103: variable 'ansible_pipelining' from source: unknown 13131 1726867206.53105: variable 'ansible_timeout' from source: unknown 13131 1726867206.53109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.53240: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867206.53482: variable 'omit' from source: magic vars 13131 1726867206.53486: starting attempt loop 13131 1726867206.53489: running the handler 13131 1726867206.53491: _low_level_execute_command(): starting 13131 1726867206.53493: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867206.54399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867206.54442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867206.54499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867206.54521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867206.54784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867206.56369: stdout chunk (state=3): >>>/root <<< 13131 1726867206.56532: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 13131 1726867206.56536: stdout chunk (state=3): >>><<< 13131 1726867206.56545: stderr chunk (state=3): >>><<< 13131 1726867206.56682: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867206.56696: _low_level_execute_command(): starting 13131 1726867206.56707: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867206.5668235-14288-116980801033007 `" && echo ansible-tmp-1726867206.5668235-14288-116980801033007="` echo /root/.ansible/tmp/ansible-tmp-1726867206.5668235-14288-116980801033007 `" ) && sleep 0' 13131 1726867206.57996: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867206.58035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867206.58231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867206.60097: stdout chunk (state=3): >>>ansible-tmp-1726867206.5668235-14288-116980801033007=/root/.ansible/tmp/ansible-tmp-1726867206.5668235-14288-116980801033007 <<< 13131 1726867206.60238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867206.60248: stdout chunk (state=3): >>><<< 13131 1726867206.60259: stderr chunk (state=3): >>><<< 13131 1726867206.60284: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867206.5668235-14288-116980801033007=/root/.ansible/tmp/ansible-tmp-1726867206.5668235-14288-116980801033007 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867206.60327: variable 'ansible_module_compression' from source: unknown 13131 1726867206.60536: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13131 1726867206.60584: variable 'ansible_facts' from source: unknown 13131 1726867206.60664: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867206.5668235-14288-116980801033007/AnsiballZ_command.py 13131 1726867206.61166: Sending initial data 13131 1726867206.61169: Sent initial data (156 bytes) 13131 1726867206.61851: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867206.61864: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867206.61964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867206.61999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867206.62072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867206.63635: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867206.63681: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867206.63728: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp4l9eszc0 /root/.ansible/tmp/ansible-tmp-1726867206.5668235-14288-116980801033007/AnsiballZ_command.py <<< 13131 1726867206.63737: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867206.5668235-14288-116980801033007/AnsiballZ_command.py" <<< 13131 1726867206.63765: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp4l9eszc0" to remote "/root/.ansible/tmp/ansible-tmp-1726867206.5668235-14288-116980801033007/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867206.5668235-14288-116980801033007/AnsiballZ_command.py" <<< 13131 1726867206.65092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867206.65108: stderr chunk (state=3): >>><<< 13131 1726867206.65249: stdout chunk (state=3): >>><<< 13131 1726867206.65252: done transferring module to remote 13131 1726867206.65254: _low_level_execute_command(): starting 13131 1726867206.65257: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867206.5668235-14288-116980801033007/ /root/.ansible/tmp/ansible-tmp-1726867206.5668235-14288-116980801033007/AnsiballZ_command.py && sleep 0' 13131 1726867206.65762: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867206.65780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867206.65884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867206.65904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867206.65985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867206.67982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867206.67986: stdout chunk (state=3): >>><<< 13131 1726867206.67988: stderr chunk (state=3): >>><<< 13131 1726867206.68008: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867206.68027: _low_level_execute_command(): starting 13131 1726867206.68037: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867206.5668235-14288-116980801033007/AnsiballZ_command.py && sleep 0' 13131 1726867206.69100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867206.69123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867206.69137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867206.69147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867206.69232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867206.84888: stdout 
chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 17:20:06.842331", "end": "2024-09-20 17:20:06.845777", "delta": "0:00:00.003446", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13131 1726867206.86300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867206.86533: stdout chunk (state=3): >>><<< 13131 1726867206.86539: stderr chunk (state=3): >>><<< 13131 1726867206.86559: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 17:20:06.842331", "end": "2024-09-20 17:20:06.845777", "delta": "0:00:00.003446", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867206.86601: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/nm-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867206.5668235-14288-116980801033007/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867206.86605: _low_level_execute_command(): starting 13131 1726867206.86611: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867206.5668235-14288-116980801033007/ > /dev/null 2>&1 && sleep 0' 13131 1726867206.87194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867206.87204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 
1726867206.87214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867206.87235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867206.87249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867206.87256: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867206.87266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867206.87281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867206.87295: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867206.87303: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867206.87312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867206.87322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867206.87338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867206.87346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867206.87390: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867206.87425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867206.87451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867206.87470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867206.87528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867206.89583: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867206.89587: stdout chunk (state=3): >>><<< 13131 1726867206.89589: stderr chunk (state=3): >>><<< 13131 1726867206.89591: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867206.89598: handler run complete 13131 1726867206.89604: Evaluated conditional (False): False 13131 1726867206.89614: variable 'result' from source: unknown 13131 1726867206.89633: Evaluated conditional ('110' in result.stdout): True 13131 1726867206.89644: attempt loop complete, returning result 13131 1726867206.89647: _execute() done 13131 1726867206.89650: dumping result to json 13131 1726867206.89656: done dumping result, returning 13131 1726867206.89664: done running TaskExecutor() for managed_node1/TASK: ** TEST check polling interval 
[0affcac9-a3a5-5f24-9b7a-000000000071] 13131 1726867206.89667: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000071 13131 1726867206.89773: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000071 13131 1726867206.89778: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/nm-bond" ], "delta": "0:00:00.003446", "end": "2024-09-20 17:20:06.845777", "rc": 0, "start": "2024-09-20 17:20:06.842331" } STDOUT: MII Polling Interval (ms): 110 13131 1726867206.89862: no more pending results, returning what we have 13131 1726867206.89866: results queue empty 13131 1726867206.89867: checking for any_errors_fatal 13131 1726867206.89872: done checking for any_errors_fatal 13131 1726867206.89873: checking for max_fail_percentage 13131 1726867206.89875: done checking for max_fail_percentage 13131 1726867206.89876: checking to see if all hosts have failed and the running result is not ok 13131 1726867206.89879: done checking to see if all hosts have failed 13131 1726867206.89879: getting the remaining hosts for this loop 13131 1726867206.89881: done getting the remaining hosts for this loop 13131 1726867206.89885: getting the next task for host managed_node1 13131 1726867206.89891: done getting next task for host managed_node1 13131 1726867206.89894: ^ task is: TASK: ** TEST check IPv4 13131 1726867206.89896: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867206.89900: getting variables 13131 1726867206.89902: in VariableManager get_vars() 13131 1726867206.89961: Calling all_inventory to load vars for managed_node1 13131 1726867206.89964: Calling groups_inventory to load vars for managed_node1 13131 1726867206.89967: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867206.90262: Calling all_plugins_play to load vars for managed_node1 13131 1726867206.90268: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867206.90273: Calling groups_plugins_play to load vars for managed_node1 13131 1726867206.91793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867206.93374: done with get_vars() 13131 1726867206.93398: done getting variables 13131 1726867206.93466: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:80 Friday 20 September 2024 17:20:06 -0400 (0:00:00.423) 0:00:22.045 ****** 13131 1726867206.93496: entering _queue_task() for managed_node1/command 13131 1726867206.93910: worker is 1 (out of 1 available) 13131 1726867206.93921: exiting _queue_task() for managed_node1/command 13131 1726867206.93931: done queuing things up, now waiting for results queue to drain 13131 1726867206.93932: waiting for pending results... 
13131 1726867206.94120: running TaskExecutor() for managed_node1/TASK: ** TEST check IPv4 13131 1726867206.94202: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000072 13131 1726867206.94213: variable 'ansible_search_path' from source: unknown 13131 1726867206.94249: calling self._execute() 13131 1726867206.94350: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.94357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.94366: variable 'omit' from source: magic vars 13131 1726867206.94983: variable 'ansible_distribution_major_version' from source: facts 13131 1726867206.94986: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867206.94989: variable 'omit' from source: magic vars 13131 1726867206.94991: variable 'omit' from source: magic vars 13131 1726867206.94993: variable 'controller_device' from source: play vars 13131 1726867206.94995: variable 'omit' from source: magic vars 13131 1726867206.94997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867206.95003: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867206.95009: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867206.95033: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867206.95045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867206.95075: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867206.95079: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.95082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.95482: Set 
connection var ansible_connection to ssh 13131 1726867206.95485: Set connection var ansible_timeout to 10 13131 1726867206.95487: Set connection var ansible_shell_type to sh 13131 1726867206.95490: Set connection var ansible_shell_executable to /bin/sh 13131 1726867206.95492: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867206.95494: Set connection var ansible_pipelining to False 13131 1726867206.95496: variable 'ansible_shell_executable' from source: unknown 13131 1726867206.95498: variable 'ansible_connection' from source: unknown 13131 1726867206.95503: variable 'ansible_module_compression' from source: unknown 13131 1726867206.95505: variable 'ansible_shell_type' from source: unknown 13131 1726867206.95508: variable 'ansible_shell_executable' from source: unknown 13131 1726867206.95510: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867206.95512: variable 'ansible_pipelining' from source: unknown 13131 1726867206.95514: variable 'ansible_timeout' from source: unknown 13131 1726867206.95516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867206.95519: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867206.95521: variable 'omit' from source: magic vars 13131 1726867206.95523: starting attempt loop 13131 1726867206.95525: running the handler 13131 1726867206.95527: _low_level_execute_command(): starting 13131 1726867206.95529: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867206.96196: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867206.96221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867206.96239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867206.96259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867206.96330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867206.97997: stdout chunk (state=3): >>>/root <<< 13131 1726867206.98102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867206.98135: stderr chunk (state=3): >>><<< 13131 1726867206.98139: stdout chunk (state=3): >>><<< 13131 1726867206.98163: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867206.98181: _low_level_execute_command(): starting 13131 1726867206.98185: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867206.9816327-14316-221671031964509 `" && echo ansible-tmp-1726867206.9816327-14316-221671031964509="` echo /root/.ansible/tmp/ansible-tmp-1726867206.9816327-14316-221671031964509 `" ) && sleep 0' 13131 1726867206.98764: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867206.98773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867206.98786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867206.98804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867206.98812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867206.98820: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867206.98830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867206.98845: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 13131 1726867206.98855: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867206.98862: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867206.98870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867206.98880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867206.98905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867206.98908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867206.98916: stderr chunk (state=3): >>>debug2: match found <<< 13131 1726867206.98926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867206.98992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867206.99005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867206.99023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867206.99093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867207.00993: stdout chunk (state=3): >>>ansible-tmp-1726867206.9816327-14316-221671031964509=/root/.ansible/tmp/ansible-tmp-1726867206.9816327-14316-221671031964509 <<< 13131 1726867207.01137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867207.01140: stdout chunk (state=3): >>><<< 13131 1726867207.01145: stderr chunk (state=3): >>><<< 13131 1726867207.01383: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867206.9816327-14316-221671031964509=/root/.ansible/tmp/ansible-tmp-1726867206.9816327-14316-221671031964509 , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867207.01387: variable 'ansible_module_compression' from source: unknown 13131 1726867207.01389: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13131 1726867207.01391: variable 'ansible_facts' from source: unknown 13131 1726867207.01394: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867206.9816327-14316-221671031964509/AnsiballZ_command.py 13131 1726867207.01529: Sending initial data 13131 1726867207.01539: Sent initial data (156 bytes) 13131 1726867207.02195: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867207.02211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867207.02227: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13131 1726867207.02288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867207.02349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867207.02366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867207.02397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867207.02489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867207.04015: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867207.04088: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13131 1726867207.04164: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp7syuae2z /root/.ansible/tmp/ansible-tmp-1726867206.9816327-14316-221671031964509/AnsiballZ_command.py <<< 13131 1726867207.04194: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867206.9816327-14316-221671031964509/AnsiballZ_command.py" <<< 13131 1726867207.04228: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp7syuae2z" to remote "/root/.ansible/tmp/ansible-tmp-1726867206.9816327-14316-221671031964509/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867206.9816327-14316-221671031964509/AnsiballZ_command.py" <<< 13131 1726867207.05034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867207.05052: stderr chunk (state=3): >>><<< 13131 1726867207.05061: stdout chunk (state=3): >>><<< 13131 1726867207.05141: done transferring module to remote 13131 1726867207.05146: _low_level_execute_command(): starting 13131 1726867207.05148: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867206.9816327-14316-221671031964509/ /root/.ansible/tmp/ansible-tmp-1726867206.9816327-14316-221671031964509/AnsiballZ_command.py && sleep 0' 13131 1726867207.05788: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867207.05806: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 
originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867207.05863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867207.05881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867207.05917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867207.05982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867207.07983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867207.07986: stdout chunk (state=3): >>><<< 13131 1726867207.07988: stderr chunk (state=3): >>><<< 13131 1726867207.07991: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867207.07993: _low_level_execute_command(): starting 13131 1726867207.07995: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867206.9816327-14316-221671031964509/AnsiballZ_command.py && sleep 0' 13131 1726867207.08353: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867207.08362: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867207.08372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867207.08389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867207.08401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867207.08411: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867207.08427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867207.08441: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867207.08449: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867207.08456: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867207.08464: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 13131 1726867207.08473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867207.08490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867207.08542: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867207.08575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867207.08588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867207.08607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867207.08685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867207.24046: stdout chunk (state=3): >>> {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.50/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 17:20:07.234843", "end": "2024-09-20 17:20:07.238544", "delta": "0:00:00.003701", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13131 1726867207.25639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867207.25643: stdout chunk (state=3): >>><<< 13131 1726867207.25883: stderr chunk (state=3): >>><<< 13131 1726867207.25888: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.50/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 17:20:07.234843", "end": "2024-09-20 17:20:07.238544", "delta": "0:00:00.003701", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867207.25891: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867206.9816327-14316-221671031964509/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867207.25899: _low_level_execute_command(): starting 13131 1726867207.25901: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867206.9816327-14316-221671031964509/ > /dev/null 2>&1 && sleep 0' 13131 1726867207.26386: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867207.26395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867207.26410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867207.26424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867207.26445: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867207.26453: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867207.26463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867207.26480: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 13131 1726867207.26490: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867207.26558: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867207.26591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867207.26612: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867207.26624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867207.26696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867207.28542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867207.28545: stdout chunk (state=3): >>><<< 13131 1726867207.28781: stderr chunk (state=3): >>><<< 13131 1726867207.28785: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867207.28787: handler run complete 13131 1726867207.28790: Evaluated conditional (False): False 13131 1726867207.28792: variable 'result' from source: set_fact 13131 1726867207.28815: Evaluated conditional ('192.0.2' in result.stdout): True 13131 1726867207.28826: attempt loop complete, returning result 13131 1726867207.28829: _execute() done 13131 1726867207.28831: dumping result to json 13131 1726867207.28838: done dumping result, returning 13131 1726867207.28847: done running TaskExecutor() for managed_node1/TASK: ** TEST check IPv4 [0affcac9-a3a5-5f24-9b7a-000000000072] 13131 1726867207.28850: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000072 13131 1726867207.28962: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000072 13131 1726867207.28965: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003701", "end": "2024-09-20 17:20:07.238544", "rc": 0, "start": "2024-09-20 17:20:07.234843" } STDOUT: 19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.50/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 236sec preferred_lft 236sec 13131 1726867207.29071: no more pending results, returning what we have 13131 1726867207.29076: results queue empty 13131 1726867207.29207: checking for any_errors_fatal 13131 1726867207.29218: done checking for any_errors_fatal 13131 1726867207.29219: checking for max_fail_percentage 
13131 1726867207.29221: done checking for max_fail_percentage 13131 1726867207.29222: checking to see if all hosts have failed and the running result is not ok 13131 1726867207.29223: done checking to see if all hosts have failed 13131 1726867207.29223: getting the remaining hosts for this loop 13131 1726867207.29225: done getting the remaining hosts for this loop 13131 1726867207.29229: getting the next task for host managed_node1 13131 1726867207.29235: done getting next task for host managed_node1 13131 1726867207.29238: ^ task is: TASK: ** TEST check IPv6 13131 1726867207.29240: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867207.29243: getting variables 13131 1726867207.29246: in VariableManager get_vars() 13131 1726867207.29419: Calling all_inventory to load vars for managed_node1 13131 1726867207.29421: Calling groups_inventory to load vars for managed_node1 13131 1726867207.29424: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867207.29433: Calling all_plugins_play to load vars for managed_node1 13131 1726867207.29436: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867207.29439: Calling groups_plugins_play to load vars for managed_node1 13131 1726867207.31113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867207.33116: done with get_vars() 13131 1726867207.33137: done getting variables 13131 1726867207.33273: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:87 Friday 20 September 2024 17:20:07 -0400 (0:00:00.398) 0:00:22.443 ****** 13131 1726867207.33302: entering _queue_task() for managed_node1/command 13131 1726867207.33543: worker is 1 (out of 1 available) 13131 1726867207.33556: exiting _queue_task() for managed_node1/command 13131 1726867207.33569: done queuing things up, now waiting for results queue to drain 13131 1726867207.33570: waiting for pending results... 13131 1726867207.33749: running TaskExecutor() for managed_node1/TASK: ** TEST check IPv6 13131 1726867207.33814: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000073 13131 1726867207.33826: variable 'ansible_search_path' from source: unknown 13131 1726867207.33855: calling self._execute() 13131 1726867207.33933: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867207.33937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867207.33946: variable 'omit' from source: magic vars 13131 1726867207.34222: variable 'ansible_distribution_major_version' from source: facts 13131 1726867207.34231: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867207.34238: variable 'omit' from source: magic vars 13131 1726867207.34253: variable 'omit' from source: magic vars 13131 1726867207.34326: variable 'controller_device' from source: play vars 13131 1726867207.34341: variable 'omit' from source: magic vars 13131 1726867207.34373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867207.34401: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867207.34421: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867207.34433: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867207.34445: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867207.34468: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867207.34471: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867207.34474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867207.34545: Set connection var ansible_connection to ssh 13131 1726867207.34552: Set connection var ansible_timeout to 10 13131 1726867207.34555: Set connection var ansible_shell_type to sh 13131 1726867207.34561: Set connection var ansible_shell_executable to /bin/sh 13131 1726867207.34569: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867207.34574: Set connection var ansible_pipelining to False 13131 1726867207.34592: variable 'ansible_shell_executable' from source: unknown 13131 1726867207.34594: variable 'ansible_connection' from source: unknown 13131 1726867207.34598: variable 'ansible_module_compression' from source: unknown 13131 1726867207.34600: variable 'ansible_shell_type' from source: unknown 13131 1726867207.34604: variable 'ansible_shell_executable' from source: unknown 13131 1726867207.34606: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867207.34611: variable 'ansible_pipelining' from source: unknown 13131 1726867207.34614: variable 'ansible_timeout' from source: unknown 13131 1726867207.34618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867207.34720: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867207.34729: variable 'omit' from source: magic vars 13131 1726867207.34734: starting attempt loop 13131 1726867207.34736: running the handler 13131 1726867207.34753: _low_level_execute_command(): starting 13131 1726867207.34759: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867207.35388: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867207.35394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867207.35531: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867207.35536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867207.35592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867207.37230: stdout chunk (state=3): >>>/root <<< 13131 1726867207.37368: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 13131 1726867207.37370: stdout chunk (state=3): >>><<< 13131 1726867207.37372: stderr chunk (state=3): >>><<< 13131 1726867207.37390: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867207.37408: _low_level_execute_command(): starting 13131 1726867207.37469: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867207.3739467-14330-271275379058441 `" && echo ansible-tmp-1726867207.3739467-14330-271275379058441="` echo /root/.ansible/tmp/ansible-tmp-1726867207.3739467-14330-271275379058441 `" ) && sleep 0' 13131 1726867207.37875: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867207.37881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867207.37884: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867207.37894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867207.37945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867207.37978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867207.38042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867207.39906: stdout chunk (state=3): >>>ansible-tmp-1726867207.3739467-14330-271275379058441=/root/.ansible/tmp/ansible-tmp-1726867207.3739467-14330-271275379058441 <<< 13131 1726867207.40183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867207.40186: stdout chunk (state=3): >>><<< 13131 1726867207.40189: stderr chunk (state=3): >>><<< 13131 1726867207.40191: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867207.3739467-14330-271275379058441=/root/.ansible/tmp/ansible-tmp-1726867207.3739467-14330-271275379058441 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867207.40194: variable 'ansible_module_compression' from source: unknown 13131 1726867207.40196: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13131 1726867207.40198: variable 'ansible_facts' from source: unknown 13131 1726867207.40271: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867207.3739467-14330-271275379058441/AnsiballZ_command.py 13131 1726867207.40409: Sending initial data 13131 1726867207.40413: Sent initial data (156 bytes) 13131 1726867207.40994: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867207.41057: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867207.41098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867207.41108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867207.41124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867207.41193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867207.42719: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 
1726867207.42754: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13131 1726867207.42806: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp57bx2_sk /root/.ansible/tmp/ansible-tmp-1726867207.3739467-14330-271275379058441/AnsiballZ_command.py <<< 13131 1726867207.42809: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867207.3739467-14330-271275379058441/AnsiballZ_command.py" <<< 13131 1726867207.42853: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp57bx2_sk" to remote "/root/.ansible/tmp/ansible-tmp-1726867207.3739467-14330-271275379058441/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867207.3739467-14330-271275379058441/AnsiballZ_command.py" <<< 13131 1726867207.43739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867207.43742: stdout chunk (state=3): >>><<< 13131 1726867207.43745: stderr chunk (state=3): >>><<< 13131 1726867207.43747: done transferring module to remote 13131 1726867207.43749: _low_level_execute_command(): starting 13131 1726867207.43751: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867207.3739467-14330-271275379058441/ /root/.ansible/tmp/ansible-tmp-1726867207.3739467-14330-271275379058441/AnsiballZ_command.py && sleep 0' 13131 1726867207.44310: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867207.44327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867207.44399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867207.46122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867207.46173: stderr chunk (state=3): >>><<< 13131 1726867207.46270: stdout chunk (state=3): >>><<< 13131 1726867207.46276: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867207.46280: _low_level_execute_command(): starting 13131 1726867207.46282: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867207.3739467-14330-271275379058441/AnsiballZ_command.py && sleep 0' 13131 1726867207.46888: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867207.46912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867207.46928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867207.47015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867207.62553: stdout chunk (state=3): >>> <<< 13131 1726867207.62558: stdout chunk (state=3): >>>{"changed": true, "stdout": "19: 
nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1a/128 scope global dynamic noprefixroute \n valid_lft 237sec preferred_lft 237sec\n inet6 2001:db8::b4e9:1dff:fe8b:1945/64 scope global dynamic noprefixroute \n valid_lft 1796sec preferred_lft 1796sec\n inet6 fe80::b4e9:1dff:fe8b:1945/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 17:20:07.620125", "end": "2024-09-20 17:20:07.623837", "delta": "0:00:00.003712", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13131 1726867207.64190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867207.64194: stdout chunk (state=3): >>><<< 13131 1726867207.64196: stderr chunk (state=3): >>><<< 13131 1726867207.64198: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1a/128 scope global dynamic noprefixroute \n valid_lft 237sec preferred_lft 237sec\n inet6 2001:db8::b4e9:1dff:fe8b:1945/64 scope global dynamic noprefixroute \n valid_lft 1796sec preferred_lft 1796sec\n inet6 fe80::b4e9:1dff:fe8b:1945/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 17:20:07.620125", "end": "2024-09-20 17:20:07.623837", "delta": "0:00:00.003712", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, 
"creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867207.64201: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867207.3739467-14330-271275379058441/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867207.64204: _low_level_execute_command(): starting 13131 1726867207.64206: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867207.3739467-14330-271275379058441/ > /dev/null 2>&1 && sleep 0' 13131 1726867207.64733: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867207.64736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867207.64757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867207.64786: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867207.64789: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867207.64832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867207.64836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867207.64858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867207.64904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867207.66705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867207.66731: stderr chunk (state=3): >>><<< 13131 1726867207.66734: stdout chunk (state=3): >>><<< 13131 1726867207.66746: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867207.66751: handler run complete 13131 1726867207.66769: Evaluated conditional (False): False 13131 1726867207.66881: variable 'result' from source: set_fact 13131 1726867207.66894: Evaluated conditional ('2001' in result.stdout): True 13131 1726867207.66905: attempt loop complete, returning result 13131 1726867207.66908: _execute() done 13131 1726867207.66910: dumping result to json 13131 1726867207.66915: done dumping result, returning 13131 1726867207.66923: done running TaskExecutor() for managed_node1/TASK: ** TEST check IPv6 [0affcac9-a3a5-5f24-9b7a-000000000073] 13131 1726867207.66926: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000073 13131 1726867207.67026: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000073 13131 1726867207.67029: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003712", "end": "2024-09-20 17:20:07.623837", "rc": 0, "start": "2024-09-20 17:20:07.620125" } STDOUT: 19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::1a/128 scope global dynamic noprefixroute valid_lft 237sec preferred_lft 237sec inet6 2001:db8::b4e9:1dff:fe8b:1945/64 scope global dynamic noprefixroute valid_lft 1796sec preferred_lft 1796sec inet6 fe80::b4e9:1dff:fe8b:1945/64 scope link noprefixroute valid_lft forever preferred_lft forever 13131 1726867207.67104: no more pending results, returning what we have 13131 1726867207.67108: results queue empty 13131 1726867207.67109: checking for any_errors_fatal 13131 1726867207.67117: done checking for any_errors_fatal 13131 1726867207.67117: checking for max_fail_percentage 13131 1726867207.67119: done checking for max_fail_percentage 13131 1726867207.67120: checking to see if all hosts have failed and the running 
result is not ok 13131 1726867207.67121: done checking to see if all hosts have failed 13131 1726867207.67121: getting the remaining hosts for this loop 13131 1726867207.67123: done getting the remaining hosts for this loop 13131 1726867207.67126: getting the next task for host managed_node1 13131 1726867207.67133: done getting next task for host managed_node1 13131 1726867207.67138: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13131 1726867207.67141: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867207.67157: getting variables 13131 1726867207.67158: in VariableManager get_vars() 13131 1726867207.67214: Calling all_inventory to load vars for managed_node1 13131 1726867207.67216: Calling groups_inventory to load vars for managed_node1 13131 1726867207.67218: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867207.67227: Calling all_plugins_play to load vars for managed_node1 13131 1726867207.67230: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867207.67232: Calling groups_plugins_play to load vars for managed_node1 13131 1726867207.68035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867207.68909: done with get_vars() 13131 1726867207.68927: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:20:07 -0400 (0:00:00.356) 0:00:22.800 ****** 13131 1726867207.68994: entering _queue_task() for managed_node1/include_tasks 13131 1726867207.69210: worker is 1 (out of 1 available) 13131 1726867207.69223: exiting _queue_task() for managed_node1/include_tasks 13131 1726867207.69234: done queuing things up, now waiting for results queue to drain 13131 1726867207.69236: waiting for pending results... 
13131 1726867207.69409: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13131 1726867207.69501: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000007b 13131 1726867207.69512: variable 'ansible_search_path' from source: unknown 13131 1726867207.69516: variable 'ansible_search_path' from source: unknown 13131 1726867207.69543: calling self._execute() 13131 1726867207.69616: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867207.69622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867207.69630: variable 'omit' from source: magic vars 13131 1726867207.69906: variable 'ansible_distribution_major_version' from source: facts 13131 1726867207.69909: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867207.69916: _execute() done 13131 1726867207.69919: dumping result to json 13131 1726867207.69921: done dumping result, returning 13131 1726867207.69928: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-5f24-9b7a-00000000007b] 13131 1726867207.69931: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000007b 13131 1726867207.70020: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000007b 13131 1726867207.70023: WORKER PROCESS EXITING 13131 1726867207.70064: no more pending results, returning what we have 13131 1726867207.70068: in VariableManager get_vars() 13131 1726867207.70125: Calling all_inventory to load vars for managed_node1 13131 1726867207.70127: Calling groups_inventory to load vars for managed_node1 13131 1726867207.70129: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867207.70138: Calling all_plugins_play to load vars for managed_node1 13131 1726867207.70140: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867207.70143: Calling 
groups_plugins_play to load vars for managed_node1 13131 1726867207.71032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867207.71873: done with get_vars() 13131 1726867207.71889: variable 'ansible_search_path' from source: unknown 13131 1726867207.71890: variable 'ansible_search_path' from source: unknown 13131 1726867207.71918: we have included files to process 13131 1726867207.71919: generating all_blocks data 13131 1726867207.71921: done generating all_blocks data 13131 1726867207.71924: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13131 1726867207.71925: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13131 1726867207.71926: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13131 1726867207.72297: done processing included file 13131 1726867207.72299: iterating over new_blocks loaded from include file 13131 1726867207.72302: in VariableManager get_vars() 13131 1726867207.72321: done with get_vars() 13131 1726867207.72322: filtering new block on tags 13131 1726867207.72332: done filtering new block on tags 13131 1726867207.72334: in VariableManager get_vars() 13131 1726867207.72349: done with get_vars() 13131 1726867207.72350: filtering new block on tags 13131 1726867207.72364: done filtering new block on tags 13131 1726867207.72366: in VariableManager get_vars() 13131 1726867207.72386: done with get_vars() 13131 1726867207.72387: filtering new block on tags 13131 1726867207.72397: done filtering new block on tags 13131 1726867207.72398: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 13131 1726867207.72404: extending task lists for 
all hosts with included blocks 13131 1726867207.72854: done extending task lists 13131 1726867207.72855: done processing included files 13131 1726867207.72855: results queue empty 13131 1726867207.72856: checking for any_errors_fatal 13131 1726867207.72859: done checking for any_errors_fatal 13131 1726867207.72859: checking for max_fail_percentage 13131 1726867207.72860: done checking for max_fail_percentage 13131 1726867207.72860: checking to see if all hosts have failed and the running result is not ok 13131 1726867207.72861: done checking to see if all hosts have failed 13131 1726867207.72861: getting the remaining hosts for this loop 13131 1726867207.72862: done getting the remaining hosts for this loop 13131 1726867207.72863: getting the next task for host managed_node1 13131 1726867207.72866: done getting next task for host managed_node1 13131 1726867207.72868: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13131 1726867207.72870: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867207.72876: getting variables 13131 1726867207.72879: in VariableManager get_vars() 13131 1726867207.72892: Calling all_inventory to load vars for managed_node1 13131 1726867207.72893: Calling groups_inventory to load vars for managed_node1 13131 1726867207.72895: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867207.72902: Calling all_plugins_play to load vars for managed_node1 13131 1726867207.72904: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867207.72906: Calling groups_plugins_play to load vars for managed_node1 13131 1726867207.73556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867207.74399: done with get_vars() 13131 1726867207.74414: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:20:07 -0400 (0:00:00.054) 0:00:22.855 ****** 13131 1726867207.74467: entering _queue_task() for managed_node1/setup 13131 1726867207.74726: worker is 1 (out of 1 available) 13131 1726867207.74742: exiting _queue_task() for managed_node1/setup 13131 1726867207.74751: done queuing things up, now waiting for results queue to drain 13131 1726867207.74753: waiting for pending results... 
13131 1726867207.74931: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13131 1726867207.75028: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000006c5 13131 1726867207.75040: variable 'ansible_search_path' from source: unknown 13131 1726867207.75043: variable 'ansible_search_path' from source: unknown 13131 1726867207.75071: calling self._execute() 13131 1726867207.75145: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867207.75149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867207.75159: variable 'omit' from source: magic vars 13131 1726867207.75426: variable 'ansible_distribution_major_version' from source: facts 13131 1726867207.75435: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867207.75578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867207.77067: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867207.77121: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867207.77151: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867207.77181: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867207.77203: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867207.77258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867207.77284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867207.77304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867207.77328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867207.77339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867207.77380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867207.77397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867207.77415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867207.77439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867207.77450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867207.77552: variable '__network_required_facts' from source: role 
'' defaults 13131 1726867207.77560: variable 'ansible_facts' from source: unknown 13131 1726867207.77973: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13131 1726867207.77978: when evaluation is False, skipping this task 13131 1726867207.77981: _execute() done 13131 1726867207.77984: dumping result to json 13131 1726867207.77986: done dumping result, returning 13131 1726867207.77993: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-5f24-9b7a-0000000006c5] 13131 1726867207.77995: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000006c5 13131 1726867207.78075: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000006c5 13131 1726867207.78080: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867207.78163: no more pending results, returning what we have 13131 1726867207.78166: results queue empty 13131 1726867207.78167: checking for any_errors_fatal 13131 1726867207.78168: done checking for any_errors_fatal 13131 1726867207.78169: checking for max_fail_percentage 13131 1726867207.78170: done checking for max_fail_percentage 13131 1726867207.78171: checking to see if all hosts have failed and the running result is not ok 13131 1726867207.78172: done checking to see if all hosts have failed 13131 1726867207.78172: getting the remaining hosts for this loop 13131 1726867207.78173: done getting the remaining hosts for this loop 13131 1726867207.78176: getting the next task for host managed_node1 13131 1726867207.78187: done getting next task for host managed_node1 13131 1726867207.78192: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13131 1726867207.78196: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867207.78213: getting variables 13131 1726867207.78215: in VariableManager get_vars() 13131 1726867207.78258: Calling all_inventory to load vars for managed_node1 13131 1726867207.78260: Calling groups_inventory to load vars for managed_node1 13131 1726867207.78263: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867207.78271: Calling all_plugins_play to load vars for managed_node1 13131 1726867207.78273: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867207.78275: Calling groups_plugins_play to load vars for managed_node1 13131 1726867207.79111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867207.79979: done with get_vars() 13131 1726867207.79994: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:20:07 -0400 (0:00:00.055) 0:00:22.911 ****** 13131 1726867207.80070: entering _queue_task() for managed_node1/stat 13131 1726867207.80310: worker is 1 (out of 1 
available) 13131 1726867207.80326: exiting _queue_task() for managed_node1/stat 13131 1726867207.80337: done queuing things up, now waiting for results queue to drain 13131 1726867207.80338: waiting for pending results... 13131 1726867207.80522: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 13131 1726867207.80622: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000006c7 13131 1726867207.80636: variable 'ansible_search_path' from source: unknown 13131 1726867207.80639: variable 'ansible_search_path' from source: unknown 13131 1726867207.80668: calling self._execute() 13131 1726867207.80740: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867207.80744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867207.80753: variable 'omit' from source: magic vars 13131 1726867207.81032: variable 'ansible_distribution_major_version' from source: facts 13131 1726867207.81041: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867207.81163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867207.81358: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867207.81392: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867207.81419: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867207.81445: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867207.81507: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867207.81525: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867207.81544: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867207.81563: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867207.81627: variable '__network_is_ostree' from source: set_fact 13131 1726867207.81632: Evaluated conditional (not __network_is_ostree is defined): False 13131 1726867207.81635: when evaluation is False, skipping this task 13131 1726867207.81637: _execute() done 13131 1726867207.81640: dumping result to json 13131 1726867207.81643: done dumping result, returning 13131 1726867207.81651: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-5f24-9b7a-0000000006c7] 13131 1726867207.81655: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000006c7 13131 1726867207.81734: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000006c7 13131 1726867207.81737: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13131 1726867207.81816: no more pending results, returning what we have 13131 1726867207.81820: results queue empty 13131 1726867207.81821: checking for any_errors_fatal 13131 1726867207.81825: done checking for any_errors_fatal 13131 1726867207.81826: checking for max_fail_percentage 13131 1726867207.81827: done checking for max_fail_percentage 13131 1726867207.81827: checking to see if all hosts have failed and the running result is not ok 13131 
1726867207.81828: done checking to see if all hosts have failed 13131 1726867207.81829: getting the remaining hosts for this loop 13131 1726867207.81830: done getting the remaining hosts for this loop 13131 1726867207.81833: getting the next task for host managed_node1 13131 1726867207.81839: done getting next task for host managed_node1 13131 1726867207.81842: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13131 1726867207.81845: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867207.81859: getting variables 13131 1726867207.81860: in VariableManager get_vars() 13131 1726867207.81904: Calling all_inventory to load vars for managed_node1 13131 1726867207.81907: Calling groups_inventory to load vars for managed_node1 13131 1726867207.81909: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867207.81917: Calling all_plugins_play to load vars for managed_node1 13131 1726867207.81919: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867207.81921: Calling groups_plugins_play to load vars for managed_node1 13131 1726867207.82648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867207.83591: done with get_vars() 13131 1726867207.83605: done getting variables 13131 1726867207.83642: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:20:07 -0400 (0:00:00.035) 0:00:22.947 ****** 13131 1726867207.83666: entering _queue_task() for managed_node1/set_fact 13131 1726867207.83871: worker is 1 (out of 1 available) 13131 1726867207.83886: exiting _queue_task() for managed_node1/set_fact 13131 1726867207.83898: done queuing things up, now waiting for results queue to drain 13131 1726867207.83899: waiting for pending results... 
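The first skip above is driven by the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`: Ansible's `difference` filter keeps only the required fact names missing from the per-host facts cache, and the role's `setup` task runs only when that list is non-empty. A minimal Python sketch of the same logic (fact names and values below are hypothetical stand-ins, not the role's actual defaults):

```python
# Hypothetical stand-ins: the real values come from the role's defaults
# (__network_required_facts) and the per-host facts cache (ansible_facts).
required_facts = ["distribution", "distribution_major_version", "os_family"]
ansible_facts = {
    "distribution": "CentOS",
    "distribution_major_version": "10",
    "os_family": "RedHat",
}

# Equivalent of `__network_required_facts | difference(ansible_facts.keys() | list)`:
# keep the required fact names absent from the gathered facts.
missing = [name for name in required_facts if name not in ansible_facts]

# Equivalent of `... | length > 0`: gather facts only when something is missing.
# Here nothing is missing, so the conditional is False and the task is
# skipped -- matching the `Evaluated conditional (...): False` line in the log.
should_gather = len(missing) > 0
```

If any required fact were absent from the cache, `should_gather` would flip to True and the role would re-run fact gathering instead of skipping.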
13131 1726867207.84067: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13131 1726867207.84164: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000006c8 13131 1726867207.84175: variable 'ansible_search_path' from source: unknown 13131 1726867207.84181: variable 'ansible_search_path' from source: unknown 13131 1726867207.84209: calling self._execute() 13131 1726867207.84280: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867207.84284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867207.84292: variable 'omit' from source: magic vars 13131 1726867207.84547: variable 'ansible_distribution_major_version' from source: facts 13131 1726867207.84557: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867207.84669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867207.84855: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867207.84888: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867207.84916: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867207.84941: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867207.85000: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867207.85021: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867207.85038: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867207.85055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867207.85120: variable '__network_is_ostree' from source: set_fact 13131 1726867207.85124: Evaluated conditional (not __network_is_ostree is defined): False 13131 1726867207.85127: when evaluation is False, skipping this task 13131 1726867207.85130: _execute() done 13131 1726867207.85133: dumping result to json 13131 1726867207.85137: done dumping result, returning 13131 1726867207.85144: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-5f24-9b7a-0000000006c8] 13131 1726867207.85146: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000006c8 13131 1726867207.85234: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000006c8 13131 1726867207.85237: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13131 1726867207.85349: no more pending results, returning what we have 13131 1726867207.85352: results queue empty 13131 1726867207.85353: checking for any_errors_fatal 13131 1726867207.85357: done checking for any_errors_fatal 13131 1726867207.85357: checking for max_fail_percentage 13131 1726867207.85359: done checking for max_fail_percentage 13131 1726867207.85359: checking to see if all hosts have failed and the running result is not ok 13131 1726867207.85360: done checking to see if all hosts have failed 13131 1726867207.85361: getting the remaining hosts for this loop 13131 1726867207.85362: done getting the remaining hosts for this loop 
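Both ostree tasks were skipped because `__network_is_ostree` had already been set by an earlier `set_fact` (the log shows the variable coming `from source: set_fact`). Guarding with `not <var> is defined` is a common compute-once-per-host pattern. A hedged sketch of what tasks like those at set_facts.yml:12 and :17 can look like (the task bodies are illustrative assumptions, not the role's exact source):

```yaml
# Illustrative sketch only -- not copied from the role.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted  # assumed marker path for ostree-based systems
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

On later role invocations in the same play, the fact is already defined, so both `when:` conditions evaluate False and the tasks report `skip_reason: Conditional result was False`, as seen above.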
13131 1726867207.85365: getting the next task for host managed_node1 13131 1726867207.85372: done getting next task for host managed_node1 13131 1726867207.85375: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13131 1726867207.85381: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867207.85397: getting variables 13131 1726867207.85398: in VariableManager get_vars() 13131 1726867207.85442: Calling all_inventory to load vars for managed_node1 13131 1726867207.85444: Calling groups_inventory to load vars for managed_node1 13131 1726867207.85446: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867207.85454: Calling all_plugins_play to load vars for managed_node1 13131 1726867207.85457: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867207.85459: Calling groups_plugins_play to load vars for managed_node1 13131 1726867207.86766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867207.88408: done with get_vars() 13131 1726867207.88434: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:20:07 -0400 (0:00:00.048) 0:00:22.995 ****** 13131 1726867207.88527: entering _queue_task() for managed_node1/service_facts 13131 1726867207.88813: worker is 1 (out of 1 available) 13131 1726867207.88827: exiting _queue_task() for managed_node1/service_facts 13131 1726867207.88839: done queuing things up, now waiting for results queue to drain 13131 1726867207.88841: waiting for pending results... 
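The task just queued uses the `service_facts` module, which populates `ansible_facts.services` with the state of every known service; a role can consult that map to decide, for example, whether NetworkManager is running. A minimal, illustrative playbook fragment (not the role's actual source at set_facts.yml:21):

```yaml
# Illustrative fragment only.
- name: Check which services are running
  ansible.builtin.service_facts:

- name: Show the state of one service (hypothetical follow-up)
  ansible.builtin.debug:
    var: ansible_facts.services['NetworkManager.service'].state
  when: "'NetworkManager.service' in ansible_facts.services"
```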
13131 1726867207.89303: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 13131 1726867207.89321: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000006ca 13131 1726867207.89418: variable 'ansible_search_path' from source: unknown 13131 1726867207.89423: variable 'ansible_search_path' from source: unknown 13131 1726867207.89426: calling self._execute() 13131 1726867207.89525: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867207.89530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867207.89533: variable 'omit' from source: magic vars 13131 1726867207.89904: variable 'ansible_distribution_major_version' from source: facts 13131 1726867207.89915: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867207.89921: variable 'omit' from source: magic vars 13131 1726867207.89972: variable 'omit' from source: magic vars 13131 1726867207.89997: variable 'omit' from source: magic vars 13131 1726867207.90026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867207.90052: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867207.90069: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867207.90088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867207.90097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867207.90122: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867207.90125: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867207.90127: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 13131 1726867207.90195: Set connection var ansible_connection to ssh 13131 1726867207.90202: Set connection var ansible_timeout to 10 13131 1726867207.90207: Set connection var ansible_shell_type to sh 13131 1726867207.90214: Set connection var ansible_shell_executable to /bin/sh 13131 1726867207.90222: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867207.90227: Set connection var ansible_pipelining to False 13131 1726867207.90243: variable 'ansible_shell_executable' from source: unknown 13131 1726867207.90246: variable 'ansible_connection' from source: unknown 13131 1726867207.90249: variable 'ansible_module_compression' from source: unknown 13131 1726867207.90251: variable 'ansible_shell_type' from source: unknown 13131 1726867207.90253: variable 'ansible_shell_executable' from source: unknown 13131 1726867207.90255: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867207.90258: variable 'ansible_pipelining' from source: unknown 13131 1726867207.90261: variable 'ansible_timeout' from source: unknown 13131 1726867207.90265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867207.90408: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867207.90417: variable 'omit' from source: magic vars 13131 1726867207.90423: starting attempt loop 13131 1726867207.90425: running the handler 13131 1726867207.90437: _low_level_execute_command(): starting 13131 1726867207.90444: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867207.90944: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 13131 1726867207.90949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 13131 1726867207.90952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867207.90954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867207.90997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867207.91003: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867207.91006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867207.91061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867207.92738: stdout chunk (state=3): >>>/root <<< 13131 1726867207.92839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867207.92864: stderr chunk (state=3): >>><<< 13131 1726867207.92869: stdout chunk (state=3): >>><<< 13131 1726867207.92891: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 
originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867207.92902: _low_level_execute_command(): starting 13131 1726867207.92910: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867207.9289017-14353-105328684781260 `" && echo ansible-tmp-1726867207.9289017-14353-105328684781260="` echo /root/.ansible/tmp/ansible-tmp-1726867207.9289017-14353-105328684781260 `" ) && sleep 0' 13131 1726867207.93338: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867207.93341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867207.93351: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867207.93353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867207.93387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867207.93405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867207.93447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867207.95321: stdout chunk (state=3): >>>ansible-tmp-1726867207.9289017-14353-105328684781260=/root/.ansible/tmp/ansible-tmp-1726867207.9289017-14353-105328684781260 <<< 13131 1726867207.95430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867207.95469: stderr chunk (state=3): >>><<< 13131 1726867207.95473: stdout chunk (state=3): >>><<< 13131 1726867207.95682: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867207.9289017-14353-105328684781260=/root/.ansible/tmp/ansible-tmp-1726867207.9289017-14353-105328684781260 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867207.95686: variable 'ansible_module_compression' from source: unknown 13131 1726867207.95688: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13131 1726867207.95690: variable 'ansible_facts' from source: unknown 13131 1726867207.95716: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867207.9289017-14353-105328684781260/AnsiballZ_service_facts.py 13131 1726867207.95931: Sending initial data 13131 1726867207.95941: Sent initial data (162 bytes) 13131 1726867207.96437: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867207.96454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867207.96471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867207.96497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867207.96584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867207.96612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867207.96632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867207.96645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867207.96711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867207.98253: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867207.98302: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867207.98353: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp8ankhor4 /root/.ansible/tmp/ansible-tmp-1726867207.9289017-14353-105328684781260/AnsiballZ_service_facts.py <<< 13131 1726867207.98357: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867207.9289017-14353-105328684781260/AnsiballZ_service_facts.py" <<< 13131 1726867207.98396: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp8ankhor4" to remote "/root/.ansible/tmp/ansible-tmp-1726867207.9289017-14353-105328684781260/AnsiballZ_service_facts.py" <<< 13131 1726867207.98399: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867207.9289017-14353-105328684781260/AnsiballZ_service_facts.py" <<< 13131 1726867207.99472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867207.99475: stdout chunk (state=3): >>><<< 13131 1726867207.99480: stderr chunk (state=3): >>><<< 13131 1726867207.99483: done transferring module to remote 13131 1726867207.99485: _low_level_execute_command(): starting 13131 1726867207.99488: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867207.9289017-14353-105328684781260/ /root/.ansible/tmp/ansible-tmp-1726867207.9289017-14353-105328684781260/AnsiballZ_service_facts.py && sleep 0' 13131 1726867208.00082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867208.00086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867208.00089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867208.00141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867208.00156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867208.00166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867208.00239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867208.02137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867208.02140: stdout chunk (state=3): >>><<< 13131 1726867208.02143: stderr chunk (state=3): >>><<< 13131 1726867208.02146: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867208.02148: _low_level_execute_command(): starting 13131 1726867208.02150: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867207.9289017-14353-105328684781260/AnsiballZ_service_facts.py && sleep 0' 13131 1726867208.02690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867208.02708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867208.02724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867208.02742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867208.02760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867208.02773: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867208.02872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867208.02918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867208.02969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867209.54951: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": 
{"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": 
"rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 13131 1726867209.54964: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": 
{"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, 
"sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13131 1726867209.56736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867209.56740: stdout chunk (state=3): >>><<< 13131 1726867209.56743: stderr chunk (state=3): >>><<< 13131 1726867209.56747: _low_level_execute_command() done: rc=0, stdout= , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867209.59185: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867207.9289017-14353-105328684781260/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867209.59190: _low_level_execute_command(): starting 13131 1726867209.59193: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867207.9289017-14353-105328684781260/ > /dev/null 2>&1 && sleep 0' 13131 1726867209.59751: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867209.59851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867209.59854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867209.59856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867209.59858: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867209.59860: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867209.59862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867209.59893: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867209.59930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867209.59943: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867209.59957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867209.60108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867209.61987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867209.61990: stdout chunk (state=3): >>><<< 13131 1726867209.61999: stderr chunk (state=3): >>><<< 13131 1726867209.62018: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867209.62184: handler run complete 13131 1726867209.62435: variable 'ansible_facts' from source: unknown 13131 1726867209.62794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867209.63860: variable 'ansible_facts' from source: unknown 13131 1726867209.64205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867209.64837: attempt loop complete, returning result 13131 1726867209.64843: _execute() done 13131 1726867209.64847: dumping result to json 13131 1726867209.65000: done dumping result, returning 13131 1726867209.65120: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-5f24-9b7a-0000000006ca] 13131 1726867209.65123: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000006ca 13131 1726867209.67021: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000006ca 13131 1726867209.67024: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867209.67130: no more pending results, returning what we have 13131 1726867209.67133: results queue empty 13131 1726867209.67138: checking for any_errors_fatal 13131 1726867209.67141: done checking for any_errors_fatal 13131 1726867209.67142: checking for max_fail_percentage 13131 1726867209.67143: done checking for max_fail_percentage 13131 1726867209.67144: checking to see if all hosts have failed and the running result is not ok 13131 
1726867209.67145: done checking to see if all hosts have failed 13131 1726867209.67145: getting the remaining hosts for this loop 13131 1726867209.67146: done getting the remaining hosts for this loop 13131 1726867209.67149: getting the next task for host managed_node1 13131 1726867209.67154: done getting next task for host managed_node1 13131 1726867209.67158: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13131 1726867209.67161: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867209.67171: getting variables 13131 1726867209.67172: in VariableManager get_vars() 13131 1726867209.67215: Calling all_inventory to load vars for managed_node1 13131 1726867209.67217: Calling groups_inventory to load vars for managed_node1 13131 1726867209.67220: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867209.67228: Calling all_plugins_play to load vars for managed_node1 13131 1726867209.67230: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867209.67233: Calling groups_plugins_play to load vars for managed_node1 13131 1726867209.68680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867209.70438: done with get_vars() 13131 1726867209.70460: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:20:09 -0400 (0:00:01.820) 0:00:24.816 ****** 13131 1726867209.70570: entering _queue_task() for managed_node1/package_facts 13131 1726867209.70952: worker is 1 (out of 1 available) 13131 1726867209.70964: exiting _queue_task() for managed_node1/package_facts 13131 1726867209.70976: done queuing things up, now waiting for results queue to drain 13131 1726867209.70980: waiting for pending results... 
13131 1726867209.71281: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 13131 1726867209.71407: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000006cb 13131 1726867209.71442: variable 'ansible_search_path' from source: unknown 13131 1726867209.71451: variable 'ansible_search_path' from source: unknown 13131 1726867209.71540: calling self._execute() 13131 1726867209.71984: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867209.71989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867209.71993: variable 'omit' from source: magic vars 13131 1726867209.72506: variable 'ansible_distribution_major_version' from source: facts 13131 1726867209.72561: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867209.72665: variable 'omit' from source: magic vars 13131 1726867209.72752: variable 'omit' from source: magic vars 13131 1726867209.72818: variable 'omit' from source: magic vars 13131 1726867209.73093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867209.73097: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867209.73099: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867209.73101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867209.73195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867209.73234: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867209.73244: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867209.73253: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 13131 1726867209.73502: Set connection var ansible_connection to ssh 13131 1726867209.73529: Set connection var ansible_timeout to 10 13131 1726867209.73531: Set connection var ansible_shell_type to sh 13131 1726867209.73535: Set connection var ansible_shell_executable to /bin/sh 13131 1726867209.73547: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867209.73555: Set connection var ansible_pipelining to False 13131 1726867209.73580: variable 'ansible_shell_executable' from source: unknown 13131 1726867209.73682: variable 'ansible_connection' from source: unknown 13131 1726867209.73686: variable 'ansible_module_compression' from source: unknown 13131 1726867209.73688: variable 'ansible_shell_type' from source: unknown 13131 1726867209.73689: variable 'ansible_shell_executable' from source: unknown 13131 1726867209.73691: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867209.73693: variable 'ansible_pipelining' from source: unknown 13131 1726867209.73694: variable 'ansible_timeout' from source: unknown 13131 1726867209.73696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867209.74019: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867209.74086: variable 'omit' from source: magic vars 13131 1726867209.74094: starting attempt loop 13131 1726867209.74101: running the handler 13131 1726867209.74291: _low_level_execute_command(): starting 13131 1726867209.74294: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867209.75706: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867209.75805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867209.75833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867209.75924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867209.75992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867209.77604: stdout chunk (state=3): >>>/root <<< 13131 1726867209.77711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867209.77748: stderr chunk (state=3): >>><<< 13131 1726867209.77796: stdout chunk (state=3): >>><<< 13131 1726867209.77987: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867209.77990: _low_level_execute_command(): starting 13131 1726867209.77994: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867209.7781746-14414-42049732499580 `" && echo ansible-tmp-1726867209.7781746-14414-42049732499580="` echo /root/.ansible/tmp/ansible-tmp-1726867209.7781746-14414-42049732499580 `" ) && sleep 0' 13131 1726867209.79093: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867209.79105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867209.79197: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867209.79311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867209.79327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867209.79413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867209.79573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867209.81407: stdout chunk (state=3): >>>ansible-tmp-1726867209.7781746-14414-42049732499580=/root/.ansible/tmp/ansible-tmp-1726867209.7781746-14414-42049732499580 <<< 13131 1726867209.81509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867209.81538: stderr chunk (state=3): >>><<< 13131 1726867209.81784: stdout chunk (state=3): >>><<< 13131 1726867209.81788: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867209.7781746-14414-42049732499580=/root/.ansible/tmp/ansible-tmp-1726867209.7781746-14414-42049732499580 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867209.81791: variable 'ansible_module_compression' from source: unknown 13131 1726867209.81793: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13131 1726867209.82002: variable 'ansible_facts' from source: unknown 13131 1726867209.82386: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867209.7781746-14414-42049732499580/AnsiballZ_package_facts.py 13131 1726867209.82967: Sending initial data 13131 1726867209.82970: Sent initial data (161 bytes) 13131 1726867209.84051: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867209.84064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867209.84074: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867209.84258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867209.84261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867209.84270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867209.84352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867209.85980: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867209.86014: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867209.86083: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp1rdtk43o /root/.ansible/tmp/ansible-tmp-1726867209.7781746-14414-42049732499580/AnsiballZ_package_facts.py <<< 13131 1726867209.86087: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867209.7781746-14414-42049732499580/AnsiballZ_package_facts.py" <<< 13131 1726867209.86249: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp1rdtk43o" to remote "/root/.ansible/tmp/ansible-tmp-1726867209.7781746-14414-42049732499580/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867209.7781746-14414-42049732499580/AnsiballZ_package_facts.py" <<< 13131 1726867209.89239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867209.89242: stderr chunk (state=3): >>><<< 13131 1726867209.89246: stdout chunk (state=3): >>><<< 13131 1726867209.89295: done transferring module to remote 13131 1726867209.89306: _low_level_execute_command(): starting 13131 1726867209.89314: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867209.7781746-14414-42049732499580/ /root/.ansible/tmp/ansible-tmp-1726867209.7781746-14414-42049732499580/AnsiballZ_package_facts.py && sleep 0' 13131 1726867209.90592: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867209.90694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867209.90714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867209.90731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867209.90809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867209.92597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867209.92772: stderr chunk (state=3): >>><<< 13131 1726867209.92775: stdout chunk (state=3): >>><<< 13131 1726867209.92781: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867209.92784: _low_level_execute_command(): starting 13131 1726867209.92787: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867209.7781746-14414-42049732499580/AnsiballZ_package_facts.py && sleep 0' 13131 1726867209.93513: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867209.93517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867209.93520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867209.93522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867209.93525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867209.93527: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867209.93529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867209.93531: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867209.93533: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867209.93535: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867209.93537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867209.93539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867209.93541: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867209.93543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867209.93545: stderr chunk (state=3): >>>debug2: match found <<< 13131 1726867209.93546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867209.93548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867209.93550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867209.93553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867209.93699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867210.37915: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": 
"google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 13131 1726867210.37982: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": 
"17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", 
"release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", 
"version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": 
"0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": 
[{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 13131 1726867210.38052: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": 
"127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": 
"systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": 
"iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": 
[{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"<<< 13131 1726867210.38114: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", 
"version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": 
"perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], 
"perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": 
[{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 13131 1726867210.38145: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": 
"1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", 
"release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13131 1726867210.39981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867210.39985: stderr chunk (state=3): >>><<< 13131 1726867210.39994: stdout chunk (state=3): >>><<< 13131 1726867210.40194: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867210.42121: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867209.7781746-14414-42049732499580/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867210.42136: _low_level_execute_command(): starting 13131 1726867210.42141: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867209.7781746-14414-42049732499580/ > /dev/null 2>&1 && sleep 0' 13131 1726867210.42546: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867210.42578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867210.42582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867210.42584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867210.42586: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867210.42588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867210.42590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867210.42638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867210.42641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867210.42697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867210.44506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867210.44535: stderr chunk (state=3): >>><<< 13131 1726867210.44537: stdout chunk (state=3): >>><<< 13131 1726867210.44547: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867210.44562: handler run complete 13131 1726867210.44994: variable 'ansible_facts' from source: unknown 13131 1726867210.45304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867210.51215: variable 'ansible_facts' from source: unknown 13131 1726867210.51503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867210.52136: attempt loop complete, returning result 13131 1726867210.52146: _execute() done 13131 1726867210.52148: dumping result to json 13131 1726867210.52268: done dumping result, returning 13131 1726867210.52274: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-5f24-9b7a-0000000006cb] 13131 1726867210.52279: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000006cb 13131 1726867210.53602: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000006cb 13131 1726867210.53605: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867210.53695: no more pending results, returning what we have 13131 1726867210.53697: results queue empty 13131 1726867210.53697: checking for any_errors_fatal 13131 1726867210.53701: done checking for any_errors_fatal 13131 1726867210.53702: checking for max_fail_percentage 13131 1726867210.53703: done checking for max_fail_percentage 13131 1726867210.53703: checking to see if all hosts have failed and the running result is not ok 13131 1726867210.53704: done checking to see if all hosts have failed 13131 1726867210.53704: getting the remaining hosts for this loop 13131 1726867210.53705: done getting the remaining 
hosts for this loop 13131 1726867210.53708: getting the next task for host managed_node1 13131 1726867210.53712: done getting next task for host managed_node1 13131 1726867210.53715: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13131 1726867210.53717: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867210.53725: getting variables 13131 1726867210.53726: in VariableManager get_vars() 13131 1726867210.53756: Calling all_inventory to load vars for managed_node1 13131 1726867210.53758: Calling groups_inventory to load vars for managed_node1 13131 1726867210.53759: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867210.53765: Calling all_plugins_play to load vars for managed_node1 13131 1726867210.53767: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867210.53768: Calling groups_plugins_play to load vars for managed_node1 13131 1726867210.57836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867210.58679: done with get_vars() 13131 1726867210.58694: done getting variables 13131 1726867210.58729: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 
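The `ok: [managed_node1]` result above is censored because the `Check which packages are installed` task sets `no_log: true` (visible as `'_ansible_no_log': True` in the module args earlier in the log). A rough sketch of the substitution Ansible performs on such results — the real logic lives in Ansible's result-cleaning code; this approximation only reproduces the visible effect:

```python
# Approximation of no_log censoring: a censored result displays only the
# 'censored' notice plus 'changed'; everything else is withheld.
CENSORED = ("the output has been hidden due to the fact that "
            "'no_log: true' was specified for this result")

def censor_result(result: dict, no_log: bool) -> dict:
    if not no_log:
        return result
    return {"censored": CENSORED, "changed": result.get("changed", False)}

shown = censor_result(
    {"ansible_facts": {"packages": {"...": []}}, "changed": False},
    no_log=True,
)
```

The censoring happens at display time on the controller; the full result (the package list seen earlier in the log at higher verbosity) is still produced on the managed node.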
TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:20:10 -0400 (0:00:00.881) 0:00:25.698 ****** 13131 1726867210.58753: entering _queue_task() for managed_node1/debug 13131 1726867210.59015: worker is 1 (out of 1 available) 13131 1726867210.59029: exiting _queue_task() for managed_node1/debug 13131 1726867210.59039: done queuing things up, now waiting for results queue to drain 13131 1726867210.59041: waiting for pending results... 13131 1726867210.59223: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 13131 1726867210.59324: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000007c 13131 1726867210.59335: variable 'ansible_search_path' from source: unknown 13131 1726867210.59339: variable 'ansible_search_path' from source: unknown 13131 1726867210.59368: calling self._execute() 13131 1726867210.59444: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867210.59448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867210.59459: variable 'omit' from source: magic vars 13131 1726867210.59737: variable 'ansible_distribution_major_version' from source: facts 13131 1726867210.59746: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867210.59754: variable 'omit' from source: magic vars 13131 1726867210.59795: variable 'omit' from source: magic vars 13131 1726867210.59868: variable 'network_provider' from source: set_fact 13131 1726867210.59883: variable 'omit' from source: magic vars 13131 1726867210.59925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867210.59946: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 
1726867210.59961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867210.59975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867210.59986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867210.60012: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867210.60015: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867210.60018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867210.60085: Set connection var ansible_connection to ssh 13131 1726867210.60092: Set connection var ansible_timeout to 10 13131 1726867210.60095: Set connection var ansible_shell_type to sh 13131 1726867210.60104: Set connection var ansible_shell_executable to /bin/sh 13131 1726867210.60112: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867210.60117: Set connection var ansible_pipelining to False 13131 1726867210.60134: variable 'ansible_shell_executable' from source: unknown 13131 1726867210.60143: variable 'ansible_connection' from source: unknown 13131 1726867210.60147: variable 'ansible_module_compression' from source: unknown 13131 1726867210.60149: variable 'ansible_shell_type' from source: unknown 13131 1726867210.60152: variable 'ansible_shell_executable' from source: unknown 13131 1726867210.60154: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867210.60156: variable 'ansible_pipelining' from source: unknown 13131 1726867210.60159: variable 'ansible_timeout' from source: unknown 13131 1726867210.60163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867210.60263: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867210.60272: variable 'omit' from source: magic vars 13131 1726867210.60279: starting attempt loop 13131 1726867210.60282: running the handler 13131 1726867210.60318: handler run complete 13131 1726867210.60327: attempt loop complete, returning result 13131 1726867210.60330: _execute() done 13131 1726867210.60333: dumping result to json 13131 1726867210.60335: done dumping result, returning 13131 1726867210.60342: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-5f24-9b7a-00000000007c] 13131 1726867210.60346: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000007c 13131 1726867210.60423: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000007c 13131 1726867210.60425: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 13131 1726867210.60519: no more pending results, returning what we have 13131 1726867210.60522: results queue empty 13131 1726867210.60522: checking for any_errors_fatal 13131 1726867210.60530: done checking for any_errors_fatal 13131 1726867210.60530: checking for max_fail_percentage 13131 1726867210.60532: done checking for max_fail_percentage 13131 1726867210.60533: checking to see if all hosts have failed and the running result is not ok 13131 1726867210.60534: done checking to see if all hosts have failed 13131 1726867210.60534: getting the remaining hosts for this loop 13131 1726867210.60536: done getting the remaining hosts for this loop 13131 1726867210.60539: getting the next task for host managed_node1 13131 1726867210.60544: done getting next task for host managed_node1 13131 1726867210.60547: ^ task is: TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13131 1726867210.60549: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867210.60559: getting variables 13131 1726867210.60560: in VariableManager get_vars() 13131 1726867210.60603: Calling all_inventory to load vars for managed_node1 13131 1726867210.60606: Calling groups_inventory to load vars for managed_node1 13131 1726867210.60607: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867210.60615: Calling all_plugins_play to load vars for managed_node1 13131 1726867210.60618: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867210.60620: Calling groups_plugins_play to load vars for managed_node1 13131 1726867210.61388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867210.62329: done with get_vars() 13131 1726867210.62343: done getting variables 13131 1726867210.62381: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:20:10 -0400 (0:00:00.036) 0:00:25.734 ****** 13131 1726867210.62404: entering _queue_task() for managed_node1/fail 13131 1726867210.62613: worker is 1 (out of 1 available) 13131 1726867210.62626: exiting _queue_task() for managed_node1/fail 13131 1726867210.62639: done queuing things up, now waiting for results queue to drain 13131 1726867210.62640: waiting for pending results... 13131 1726867210.62807: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13131 1726867210.62897: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000007d 13131 1726867210.62910: variable 'ansible_search_path' from source: unknown 13131 1726867210.62913: variable 'ansible_search_path' from source: unknown 13131 1726867210.62941: calling self._execute() 13131 1726867210.63010: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867210.63015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867210.63024: variable 'omit' from source: magic vars 13131 1726867210.63381: variable 'ansible_distribution_major_version' from source: facts 13131 1726867210.63400: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867210.63521: variable 'network_state' from source: role '' defaults 13131 1726867210.63537: Evaluated conditional (network_state != {}): False 13131 1726867210.63545: when evaluation is False, skipping this task 13131 1726867210.63552: _execute() done 13131 1726867210.63559: dumping result to json 13131 1726867210.63567: done dumping result, returning 13131 1726867210.63581: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : 
Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-5f24-9b7a-00000000007d] 13131 1726867210.63602: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000007d 13131 1726867210.63788: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000007d 13131 1726867210.63792: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867210.63841: no more pending results, returning what we have 13131 1726867210.63845: results queue empty 13131 1726867210.63846: checking for any_errors_fatal 13131 1726867210.63850: done checking for any_errors_fatal 13131 1726867210.63851: checking for max_fail_percentage 13131 1726867210.63853: done checking for max_fail_percentage 13131 1726867210.63854: checking to see if all hosts have failed and the running result is not ok 13131 1726867210.63854: done checking to see if all hosts have failed 13131 1726867210.63855: getting the remaining hosts for this loop 13131 1726867210.63856: done getting the remaining hosts for this loop 13131 1726867210.63859: getting the next task for host managed_node1 13131 1726867210.63866: done getting next task for host managed_node1 13131 1726867210.63870: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13131 1726867210.63874: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867210.63894: getting variables 13131 1726867210.63896: in VariableManager get_vars() 13131 1726867210.63946: Calling all_inventory to load vars for managed_node1 13131 1726867210.63949: Calling groups_inventory to load vars for managed_node1 13131 1726867210.63951: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867210.63963: Calling all_plugins_play to load vars for managed_node1 13131 1726867210.63966: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867210.63969: Calling groups_plugins_play to load vars for managed_node1 13131 1726867210.64955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867210.65817: done with get_vars() 13131 1726867210.65832: done getting variables 13131 1726867210.65870: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:20:10 -0400 (0:00:00.034) 0:00:25.769 ****** 13131 1726867210.65892: entering _queue_task() for managed_node1/fail 13131 1726867210.66074: worker is 1 (out of 1 available) 13131 1726867210.66089: exiting _queue_task() for managed_node1/fail 13131 1726867210.66100: done queuing things up, now waiting for results queue to drain 13131 1726867210.66101: waiting for pending results... 
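The abort task above is skipped because `network_state` comes from the role defaults as an empty dict, so `when: network_state != {}` evaluates False. Ansible evaluates this with Jinja2; a plain-Python approximation of the same check (variable names are from the log, the helper is illustrative):

```python
# Role default, as in the log: network_state is {} unless the caller sets it.
role_defaults = {"network_state": {}}

def should_run_abort(task_vars: dict) -> bool:
    # Approximates the task's `when: network_state != {}` condition.
    network_state = task_vars.get("network_state", {})
    return network_state != {}

skipped_on_default = not should_run_abort(role_defaults)
runs_with_state = should_run_abort({"network_state": {"interfaces": []}})
```

The same condition gates the next task in the log as well, which is why both report `"false_condition": "network_state != {}"`.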
13131 1726867210.66331: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13131 1726867210.66491: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000007e 13131 1726867210.66501: variable 'ansible_search_path' from source: unknown 13131 1726867210.66504: variable 'ansible_search_path' from source: unknown 13131 1726867210.66683: calling self._execute() 13131 1726867210.66688: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867210.66691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867210.66694: variable 'omit' from source: magic vars 13131 1726867210.67087: variable 'ansible_distribution_major_version' from source: facts 13131 1726867210.67091: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867210.67283: variable 'network_state' from source: role '' defaults 13131 1726867210.67286: Evaluated conditional (network_state != {}): False 13131 1726867210.67288: when evaluation is False, skipping this task 13131 1726867210.67291: _execute() done 13131 1726867210.67292: dumping result to json 13131 1726867210.67295: done dumping result, returning 13131 1726867210.67297: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-5f24-9b7a-00000000007e] 13131 1726867210.67299: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000007e 13131 1726867210.67354: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000007e 13131 1726867210.67357: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867210.67392: no more pending results, returning what we have 13131 
1726867210.67395: results queue empty 13131 1726867210.67396: checking for any_errors_fatal 13131 1726867210.67400: done checking for any_errors_fatal 13131 1726867210.67401: checking for max_fail_percentage 13131 1726867210.67403: done checking for max_fail_percentage 13131 1726867210.67403: checking to see if all hosts have failed and the running result is not ok 13131 1726867210.67404: done checking to see if all hosts have failed 13131 1726867210.67405: getting the remaining hosts for this loop 13131 1726867210.67406: done getting the remaining hosts for this loop 13131 1726867210.67409: getting the next task for host managed_node1 13131 1726867210.67414: done getting next task for host managed_node1 13131 1726867210.67417: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13131 1726867210.67420: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867210.67435: getting variables 13131 1726867210.67436: in VariableManager get_vars() 13131 1726867210.67481: Calling all_inventory to load vars for managed_node1 13131 1726867210.67483: Calling groups_inventory to load vars for managed_node1 13131 1726867210.67485: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867210.67493: Calling all_plugins_play to load vars for managed_node1 13131 1726867210.67495: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867210.67498: Calling groups_plugins_play to load vars for managed_node1 13131 1726867210.68894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867210.70495: done with get_vars() 13131 1726867210.70518: done getting variables 13131 1726867210.70571: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:20:10 -0400 (0:00:00.047) 0:00:25.816 ****** 13131 1726867210.70608: entering _queue_task() for managed_node1/fail 13131 1726867210.71102: worker is 1 (out of 1 available) 13131 1726867210.71115: exiting _queue_task() for managed_node1/fail 13131 1726867210.71126: done queuing things up, now waiting for results queue to drain 13131 1726867210.71127: waiting for pending results... 
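Each task above first re-evaluates `ansible_distribution_major_version != '6'` (a string comparison against the gathered fact) and, for the teaming abort that follows, `ansible_distribution_major_version | int > 9` (an integer cast). A small sketch of why the `| int` filter matters, assuming the fact value `'10'` seen on this EL10 host:

```python
major = "10"  # ansible_distribution_major_version is gathered as a string

not_el6 = major != "6"          # string comparison, as in `!= '6'`
el10_or_later = int(major) > 9  # integer cast, as in `| int > 9`

# Pitfall the `| int` filter avoids: comparing the raw strings is
# lexicographic, so "10" would sort *below* "9".
string_compare_wrong = "10" > "9"
```

The log shows both conditionals evaluating True before the teaming check itself is reached.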
13131 1726867210.71248: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13131 1726867210.71358: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000007f 13131 1726867210.71382: variable 'ansible_search_path' from source: unknown 13131 1726867210.71385: variable 'ansible_search_path' from source: unknown 13131 1726867210.71423: calling self._execute() 13131 1726867210.71526: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867210.71532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867210.71583: variable 'omit' from source: magic vars 13131 1726867210.71953: variable 'ansible_distribution_major_version' from source: facts 13131 1726867210.71965: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867210.72154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867210.74395: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867210.74529: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867210.74533: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867210.74543: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867210.74566: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867210.74647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867210.74676: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867210.74703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867210.74751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867210.74855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867210.74861: variable 'ansible_distribution_major_version' from source: facts 13131 1726867210.74879: Evaluated conditional (ansible_distribution_major_version | int > 9): True 13131 1726867210.74999: variable 'ansible_distribution' from source: facts 13131 1726867210.75003: variable '__network_rh_distros' from source: role '' defaults 13131 1726867210.75016: Evaluated conditional (ansible_distribution in __network_rh_distros): True 13131 1726867210.75276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867210.75313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867210.75339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 
1726867210.75382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867210.75394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867210.75446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867210.75469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867210.75492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867210.75537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867210.75550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867210.75592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867210.75625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13131 1726867210.75649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867210.75686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867210.75702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867210.76046: variable 'network_connections' from source: task vars 13131 1726867210.76092: variable 'controller_profile' from source: play vars 13131 1726867210.76122: variable 'controller_profile' from source: play vars 13131 1726867210.76132: variable 'network_state' from source: role '' defaults 13131 1726867210.76209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867210.76582: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867210.76585: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867210.76588: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867210.76592: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867210.76595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867210.76598: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867210.76610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867210.76633: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867210.76655: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 13131 1726867210.76659: when evaluation is False, skipping this task 13131 1726867210.76662: _execute() done 13131 1726867210.76664: dumping result to json 13131 1726867210.76666: done dumping result, returning 13131 1726867210.76675: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-5f24-9b7a-00000000007f] 13131 1726867210.76680: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000007f 13131 1726867210.76779: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000007f 13131 1726867210.76783: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 13131 1726867210.76837: no more pending results, returning what we have 13131 
1726867210.76841: results queue empty 13131 1726867210.76842: checking for any_errors_fatal 13131 1726867210.76849: done checking for any_errors_fatal 13131 1726867210.76849: checking for max_fail_percentage 13131 1726867210.76851: done checking for max_fail_percentage 13131 1726867210.76852: checking to see if all hosts have failed and the running result is not ok 13131 1726867210.76853: done checking to see if all hosts have failed 13131 1726867210.76853: getting the remaining hosts for this loop 13131 1726867210.76855: done getting the remaining hosts for this loop 13131 1726867210.76859: getting the next task for host managed_node1 13131 1726867210.76867: done getting next task for host managed_node1 13131 1726867210.76871: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13131 1726867210.76874: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867210.76896: getting variables 13131 1726867210.76898: in VariableManager get_vars() 13131 1726867210.76958: Calling all_inventory to load vars for managed_node1 13131 1726867210.76961: Calling groups_inventory to load vars for managed_node1 13131 1726867210.76964: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867210.76975: Calling all_plugins_play to load vars for managed_node1 13131 1726867210.77188: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867210.77193: Calling groups_plugins_play to load vars for managed_node1 13131 1726867210.78656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867210.80254: done with get_vars() 13131 1726867210.80274: done getting variables 13131 1726867210.80341: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:20:10 -0400 (0:00:00.097) 0:00:25.914 ****** 13131 1726867210.80372: entering _queue_task() for managed_node1/dnf 13131 1726867210.80785: worker is 1 (out of 1 available) 13131 1726867210.80796: exiting _queue_task() for managed_node1/dnf 13131 1726867210.80807: done queuing things up, now waiting for results queue to drain 13131 1726867210.80809: waiting for pending results... 
13131 1726867210.81098: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13131 1726867210.81149: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000080 13131 1726867210.81167: variable 'ansible_search_path' from source: unknown 13131 1726867210.81174: variable 'ansible_search_path' from source: unknown 13131 1726867210.81283: calling self._execute() 13131 1726867210.81331: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867210.81343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867210.81357: variable 'omit' from source: magic vars 13131 1726867210.81756: variable 'ansible_distribution_major_version' from source: facts 13131 1726867210.81771: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867210.81988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867210.84621: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867210.84696: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867210.84751: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867210.84883: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867210.84888: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867210.84920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867210.84953: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867210.84983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867210.85038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867210.85056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867210.85174: variable 'ansible_distribution' from source: facts 13131 1726867210.85189: variable 'ansible_distribution_major_version' from source: facts 13131 1726867210.85223: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13131 1726867210.85441: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867210.85495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867210.85528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867210.85564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867210.85611: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867210.85630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867210.85684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867210.85715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867210.85742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867210.85792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867210.85813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867210.85854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867210.85890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 
1726867210.85923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867210.85964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867210.85989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867210.86183: variable 'network_connections' from source: task vars 13131 1726867210.86186: variable 'controller_profile' from source: play vars 13131 1726867210.86242: variable 'controller_profile' from source: play vars 13131 1726867210.86327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867210.86502: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867210.86549: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867210.86636: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867210.86639: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867210.86681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867210.86711: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867210.86755: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867210.86786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867210.86836: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867210.87095: variable 'network_connections' from source: task vars 13131 1726867210.87283: variable 'controller_profile' from source: play vars 13131 1726867210.87286: variable 'controller_profile' from source: play vars 13131 1726867210.87288: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13131 1726867210.87290: when evaluation is False, skipping this task 13131 1726867210.87292: _execute() done 13131 1726867210.87293: dumping result to json 13131 1726867210.87295: done dumping result, returning 13131 1726867210.87297: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-000000000080] 13131 1726867210.87299: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000080 13131 1726867210.87366: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000080 13131 1726867210.87369: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13131 1726867210.87428: no more pending results, returning what we have 13131 1726867210.87432: results queue empty 13131 1726867210.87433: checking for any_errors_fatal 13131 1726867210.87439: done checking for 
any_errors_fatal 13131 1726867210.87440: checking for max_fail_percentage 13131 1726867210.87442: done checking for max_fail_percentage 13131 1726867210.87443: checking to see if all hosts have failed and the running result is not ok 13131 1726867210.87444: done checking to see if all hosts have failed 13131 1726867210.87444: getting the remaining hosts for this loop 13131 1726867210.87446: done getting the remaining hosts for this loop 13131 1726867210.87451: getting the next task for host managed_node1 13131 1726867210.87459: done getting next task for host managed_node1 13131 1726867210.87464: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13131 1726867210.87467: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867210.87488: getting variables 13131 1726867210.87490: in VariableManager get_vars() 13131 1726867210.87552: Calling all_inventory to load vars for managed_node1 13131 1726867210.87556: Calling groups_inventory to load vars for managed_node1 13131 1726867210.87558: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867210.87569: Calling all_plugins_play to load vars for managed_node1 13131 1726867210.87573: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867210.87576: Calling groups_plugins_play to load vars for managed_node1 13131 1726867210.89495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867210.91166: done with get_vars() 13131 1726867210.91193: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13131 1726867210.91275: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:20:10 -0400 (0:00:00.109) 0:00:26.023 ****** 13131 1726867210.91310: entering _queue_task() for managed_node1/yum 13131 1726867210.91671: worker is 1 (out of 1 available) 13131 1726867210.91686: exiting _queue_task() for managed_node1/yum 13131 1726867210.91697: done queuing things up, now waiting for results queue to drain 13131 1726867210.91699: waiting for pending results... 
13131 1726867210.92096: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13131 1726867210.92166: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000081 13131 1726867210.92191: variable 'ansible_search_path' from source: unknown 13131 1726867210.92204: variable 'ansible_search_path' from source: unknown 13131 1726867210.92256: calling self._execute() 13131 1726867210.92583: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867210.92587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867210.92591: variable 'omit' from source: magic vars 13131 1726867210.92783: variable 'ansible_distribution_major_version' from source: facts 13131 1726867210.92805: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867210.93010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867210.95365: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867210.95450: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867210.95491: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867210.95529: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867210.95564: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867210.95640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867210.95684: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867210.95720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867210.95771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867210.95795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867210.95915: variable 'ansible_distribution_major_version' from source: facts 13131 1726867210.95938: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13131 1726867210.95947: when evaluation is False, skipping this task 13131 1726867210.95954: _execute() done 13131 1726867210.95961: dumping result to json 13131 1726867210.95983: done dumping result, returning 13131 1726867210.95990: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-000000000081] 13131 1726867210.96086: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000081 13131 1726867210.96165: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000081 13131 1726867210.96168: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13131 1726867210.96227: no more pending results, returning 
what we have 13131 1726867210.96230: results queue empty 13131 1726867210.96231: checking for any_errors_fatal 13131 1726867210.96239: done checking for any_errors_fatal 13131 1726867210.96240: checking for max_fail_percentage 13131 1726867210.96242: done checking for max_fail_percentage 13131 1726867210.96242: checking to see if all hosts have failed and the running result is not ok 13131 1726867210.96243: done checking to see if all hosts have failed 13131 1726867210.96244: getting the remaining hosts for this loop 13131 1726867210.96245: done getting the remaining hosts for this loop 13131 1726867210.96249: getting the next task for host managed_node1 13131 1726867210.96257: done getting next task for host managed_node1 13131 1726867210.96262: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13131 1726867210.96265: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867210.96285: getting variables 13131 1726867210.96287: in VariableManager get_vars() 13131 1726867210.96348: Calling all_inventory to load vars for managed_node1 13131 1726867210.96351: Calling groups_inventory to load vars for managed_node1 13131 1726867210.96354: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867210.96365: Calling all_plugins_play to load vars for managed_node1 13131 1726867210.96368: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867210.96371: Calling groups_plugins_play to load vars for managed_node1 13131 1726867210.98070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867210.99858: done with get_vars() 13131 1726867210.99881: done getting variables 13131 1726867210.99939: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:20:10 -0400 (0:00:00.086) 0:00:26.110 ****** 13131 1726867210.99979: entering _queue_task() for managed_node1/fail 13131 1726867211.00391: worker is 1 (out of 1 available) 13131 1726867211.00404: exiting _queue_task() for managed_node1/fail 13131 1726867211.00413: done queuing things up, now waiting for results queue to drain 13131 1726867211.00415: waiting for pending results... 
13131 1726867211.01096: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13131 1726867211.01373: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000082 13131 1726867211.01378: variable 'ansible_search_path' from source: unknown 13131 1726867211.01381: variable 'ansible_search_path' from source: unknown 13131 1726867211.01384: calling self._execute() 13131 1726867211.01535: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867211.01604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867211.01622: variable 'omit' from source: magic vars 13131 1726867211.02461: variable 'ansible_distribution_major_version' from source: facts 13131 1726867211.02480: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867211.02817: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867211.03332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867211.07789: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867211.07983: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867211.08083: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867211.08178: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867211.08215: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867211.08417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13131 1726867211.08609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867211.08613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.08696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867211.08723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867211.08825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867211.08913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867211.08946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.09031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867211.09121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867211.09257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867211.09263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867211.09293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.09396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867211.09417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867211.09608: variable 'network_connections' from source: task vars 13131 1726867211.09628: variable 'controller_profile' from source: play vars 13131 1726867211.09833: variable 'controller_profile' from source: play vars 13131 1726867211.09924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867211.10112: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867211.10167: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867211.10210: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 
1726867211.10248: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867211.10295: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867211.10343: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867211.10361: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.10393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867211.10483: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867211.10718: variable 'network_connections' from source: task vars 13131 1726867211.10728: variable 'controller_profile' from source: play vars 13131 1726867211.10802: variable 'controller_profile' from source: play vars 13131 1726867211.10832: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13131 1726867211.10852: when evaluation is False, skipping this task 13131 1726867211.10855: _execute() done 13131 1726867211.10857: dumping result to json 13131 1726867211.10883: done dumping result, returning 13131 1726867211.10886: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-000000000082] 13131 1726867211.10888: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000082 13131 1726867211.11132: 
done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000082 13131 1726867211.11135: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13131 1726867211.11192: no more pending results, returning what we have 13131 1726867211.11196: results queue empty 13131 1726867211.11197: checking for any_errors_fatal 13131 1726867211.11206: done checking for any_errors_fatal 13131 1726867211.11207: checking for max_fail_percentage 13131 1726867211.11209: done checking for max_fail_percentage 13131 1726867211.11210: checking to see if all hosts have failed and the running result is not ok 13131 1726867211.11210: done checking to see if all hosts have failed 13131 1726867211.11211: getting the remaining hosts for this loop 13131 1726867211.11212: done getting the remaining hosts for this loop 13131 1726867211.11216: getting the next task for host managed_node1 13131 1726867211.11223: done getting next task for host managed_node1 13131 1726867211.11227: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13131 1726867211.11232: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867211.11487: getting variables 13131 1726867211.11489: in VariableManager get_vars() 13131 1726867211.11539: Calling all_inventory to load vars for managed_node1 13131 1726867211.11542: Calling groups_inventory to load vars for managed_node1 13131 1726867211.11544: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867211.11552: Calling all_plugins_play to load vars for managed_node1 13131 1726867211.11555: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867211.11558: Calling groups_plugins_play to load vars for managed_node1 13131 1726867211.12541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867211.13424: done with get_vars() 13131 1726867211.13439: done getting variables 13131 1726867211.13481: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:20:11 -0400 (0:00:00.135) 0:00:26.245 ****** 13131 1726867211.13508: entering _queue_task() for managed_node1/package 13131 1726867211.13746: worker is 1 (out of 1 available) 13131 1726867211.13763: exiting _queue_task() for managed_node1/package 13131 1726867211.13781: done queuing things up, now waiting for results queue to drain 13131 1726867211.13783: waiting for pending results... 
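Each task in this role first checks a distribution guard before its own conditional; the log records it as `Evaluated conditional (ansible_distribution_major_version != '6'): True`. A minimal Python sketch of that check, assuming an illustrative fact value (the log does not print the actual one):

```python
# Hypothetical fact value for illustration; Ansible stores this
# distribution fact as a string, so the comparison is string inequality.
ansible_distribution_major_version = "40"

# The role-wide guard: skip these tasks entirely on EL6-era hosts.
run_on_this_host = ansible_distribution_major_version != "6"
print(run_on_this_host)  # True, matching the evaluations in this log
```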
13131 1726867211.14098: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 13131 1726867211.14227: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000083 13131 1726867211.14283: variable 'ansible_search_path' from source: unknown 13131 1726867211.14287: variable 'ansible_search_path' from source: unknown 13131 1726867211.14304: calling self._execute() 13131 1726867211.14416: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867211.14435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867211.14451: variable 'omit' from source: magic vars 13131 1726867211.14896: variable 'ansible_distribution_major_version' from source: facts 13131 1726867211.14913: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867211.15046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867211.15239: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867211.15273: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867211.15304: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867211.15353: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867211.15435: variable 'network_packages' from source: role '' defaults 13131 1726867211.15513: variable '__network_provider_setup' from source: role '' defaults 13131 1726867211.15522: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867211.15567: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867211.15574: variable '__network_packages_default_nm' from source: role '' defaults 13131 1726867211.15621: variable 
'__network_packages_default_nm' from source: role '' defaults 13131 1726867211.15739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867211.17283: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867211.17286: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867211.17288: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867211.17296: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867211.17329: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867211.17767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867211.17809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867211.17834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.17875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867211.17906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 
1726867211.17955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867211.17991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867211.18026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.18100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867211.18124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867211.18290: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13131 1726867211.18367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867211.18385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867211.18402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.18430: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867211.18443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867211.18503: variable 'ansible_python' from source: facts 13131 1726867211.18526: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13131 1726867211.18581: variable '__network_wpa_supplicant_required' from source: role '' defaults 13131 1726867211.18635: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13131 1726867211.18720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867211.18736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867211.18755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.18784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867211.18794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867211.18829: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867211.18850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867211.18868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.18896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867211.18909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867211.19003: variable 'network_connections' from source: task vars 13131 1726867211.19011: variable 'controller_profile' from source: play vars 13131 1726867211.19080: variable 'controller_profile' from source: play vars 13131 1726867211.19133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867211.19151: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867211.19171: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.19197: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867211.19234: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867211.19411: variable 'network_connections' from source: task vars 13131 1726867211.19416: variable 'controller_profile' from source: play vars 13131 1726867211.19484: variable 'controller_profile' from source: play vars 13131 1726867211.19510: variable '__network_packages_default_wireless' from source: role '' defaults 13131 1726867211.19564: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867211.19850: variable 'network_connections' from source: task vars 13131 1726867211.19854: variable 'controller_profile' from source: play vars 13131 1726867211.20082: variable 'controller_profile' from source: play vars 13131 1726867211.20086: variable '__network_packages_default_team' from source: role '' defaults 13131 1726867211.20088: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867211.20333: variable 'network_connections' from source: task vars 13131 1726867211.20336: variable 'controller_profile' from source: play vars 13131 1726867211.20436: variable 'controller_profile' from source: play vars 13131 1726867211.20461: variable '__network_service_name_default_initscripts' from source: role '' defaults 13131 1726867211.20518: variable '__network_service_name_default_initscripts' from source: role '' defaults 13131 1726867211.20531: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 1726867211.20644: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 1726867211.20810: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13131 1726867211.21229: variable 'network_connections' from source: task vars 13131 
1726867211.21232: variable 'controller_profile' from source: play vars 13131 1726867211.21302: variable 'controller_profile' from source: play vars 13131 1726867211.21313: variable 'ansible_distribution' from source: facts 13131 1726867211.21315: variable '__network_rh_distros' from source: role '' defaults 13131 1726867211.21318: variable 'ansible_distribution_major_version' from source: facts 13131 1726867211.21324: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13131 1726867211.21487: variable 'ansible_distribution' from source: facts 13131 1726867211.21490: variable '__network_rh_distros' from source: role '' defaults 13131 1726867211.21509: variable 'ansible_distribution_major_version' from source: facts 13131 1726867211.21512: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13131 1726867211.21682: variable 'ansible_distribution' from source: facts 13131 1726867211.21685: variable '__network_rh_distros' from source: role '' defaults 13131 1726867211.21688: variable 'ansible_distribution_major_version' from source: facts 13131 1726867211.21716: variable 'network_provider' from source: set_fact 13131 1726867211.21731: variable 'ansible_facts' from source: unknown 13131 1726867211.22369: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13131 1726867211.22372: when evaluation is False, skipping this task 13131 1726867211.22374: _execute() done 13131 1726867211.22376: dumping result to json 13131 1726867211.22380: done dumping result, returning 13131 1726867211.22383: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-5f24-9b7a-000000000083] 13131 1726867211.22385: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000083 13131 1726867211.22647: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000083 13131 1726867211.22649: WORKER PROCESS 
EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13131 1726867211.22693: no more pending results, returning what we have 13131 1726867211.22696: results queue empty 13131 1726867211.22697: checking for any_errors_fatal 13131 1726867211.22702: done checking for any_errors_fatal 13131 1726867211.22702: checking for max_fail_percentage 13131 1726867211.22704: done checking for max_fail_percentage 13131 1726867211.22705: checking to see if all hosts have failed and the running result is not ok 13131 1726867211.22705: done checking to see if all hosts have failed 13131 1726867211.22706: getting the remaining hosts for this loop 13131 1726867211.22707: done getting the remaining hosts for this loop 13131 1726867211.22710: getting the next task for host managed_node1 13131 1726867211.22716: done getting next task for host managed_node1 13131 1726867211.22719: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13131 1726867211.22721: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867211.22737: getting variables 13131 1726867211.22738: in VariableManager get_vars() 13131 1726867211.22789: Calling all_inventory to load vars for managed_node1 13131 1726867211.22792: Calling groups_inventory to load vars for managed_node1 13131 1726867211.22795: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867211.22803: Calling all_plugins_play to load vars for managed_node1 13131 1726867211.22806: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867211.22809: Calling groups_plugins_play to load vars for managed_node1 13131 1726867211.24430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867211.25550: done with get_vars() 13131 1726867211.25566: done getting variables 13131 1726867211.25612: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:20:11 -0400 (0:00:00.121) 0:00:26.366 ****** 13131 1726867211.25637: entering _queue_task() for managed_node1/package 13131 1726867211.25938: worker is 1 (out of 1 available) 13131 1726867211.25951: exiting _queue_task() for managed_node1/package 13131 1726867211.25963: done queuing things up, now waiting for results queue to drain 13131 1726867211.25965: waiting for pending results... 
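The "Install packages" skip above turns on Jinja2's `subset` test: the task runs only when `network_packages` is not already a subset of the installed-package facts. A minimal Python sketch of the equivalent check, with illustrative package names (not taken from this run):

```python
# Required packages for the selected provider (illustrative values).
network_packages = ["NetworkManager"]

# Shape of ansible_facts.packages as gathered by package_facts:
# a dict keyed by package name, values holding version details.
installed_packages = {
    "NetworkManager": [{"version": "1.48.10"}],
    "openssh-server": [{"version": "9.6p1"}],
}

# Ansible's `subset` test behaves like Python's set.issubset();
# the task's `when` negates it, so nothing left to install => skip.
needs_install = not set(network_packages).issubset(installed_packages.keys())
print(needs_install)  # False -> "Conditional result was False", task skipped
```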
13131 1726867211.26303: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13131 1726867211.26398: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000084 13131 1726867211.26422: variable 'ansible_search_path' from source: unknown 13131 1726867211.26431: variable 'ansible_search_path' from source: unknown 13131 1726867211.26506: calling self._execute() 13131 1726867211.26581: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867211.26593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867211.26613: variable 'omit' from source: magic vars 13131 1726867211.27000: variable 'ansible_distribution_major_version' from source: facts 13131 1726867211.27049: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867211.27143: variable 'network_state' from source: role '' defaults 13131 1726867211.27165: Evaluated conditional (network_state != {}): False 13131 1726867211.27172: when evaluation is False, skipping this task 13131 1726867211.27265: _execute() done 13131 1726867211.27268: dumping result to json 13131 1726867211.27271: done dumping result, returning 13131 1726867211.27273: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-5f24-9b7a-000000000084] 13131 1726867211.27275: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000084 13131 1726867211.27338: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000084 13131 1726867211.27341: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867211.27423: no more pending results, returning what we have 13131 1726867211.27427: results queue empty 13131 1726867211.27428: checking 
for any_errors_fatal 13131 1726867211.27434: done checking for any_errors_fatal 13131 1726867211.27435: checking for max_fail_percentage 13131 1726867211.27436: done checking for max_fail_percentage 13131 1726867211.27437: checking to see if all hosts have failed and the running result is not ok 13131 1726867211.27438: done checking to see if all hosts have failed 13131 1726867211.27438: getting the remaining hosts for this loop 13131 1726867211.27440: done getting the remaining hosts for this loop 13131 1726867211.27443: getting the next task for host managed_node1 13131 1726867211.27449: done getting next task for host managed_node1 13131 1726867211.27453: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13131 1726867211.27456: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867211.27472: getting variables 13131 1726867211.27473: in VariableManager get_vars() 13131 1726867211.27519: Calling all_inventory to load vars for managed_node1 13131 1726867211.27522: Calling groups_inventory to load vars for managed_node1 13131 1726867211.27524: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867211.27532: Calling all_plugins_play to load vars for managed_node1 13131 1726867211.27534: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867211.27536: Calling groups_plugins_play to load vars for managed_node1 13131 1726867211.28848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867211.30470: done with get_vars() 13131 1726867211.30493: done getting variables 13131 1726867211.30550: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:20:11 -0400 (0:00:00.049) 0:00:26.416 ****** 13131 1726867211.30584: entering _queue_task() for managed_node1/package 13131 1726867211.31108: worker is 1 (out of 1 available) 13131 1726867211.31116: exiting _queue_task() for managed_node1/package 13131 1726867211.31126: done queuing things up, now waiting for results queue to drain 13131 1726867211.31128: waiting for pending results... 
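Both nmstate-related install tasks are guarded by `network_state != {}`; with the role default of an empty dict, the conditional evaluates False and the tasks are skipped. A sketch of that comparison:

```python
# Role default: no declarative network_state was supplied in this play.
network_state = {}

# Guard shared by the NetworkManager/nmstate and python3-libnmstate
# install tasks; an empty dict compares equal to {}, so both skip.
use_network_state = network_state != {}
print(use_network_state)  # False -> "Conditional result was False"
```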
13131 1726867211.31188: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13131 1726867211.31462: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000085 13131 1726867211.31466: variable 'ansible_search_path' from source: unknown 13131 1726867211.31469: variable 'ansible_search_path' from source: unknown 13131 1726867211.31472: calling self._execute() 13131 1726867211.31509: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867211.31520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867211.31534: variable 'omit' from source: magic vars 13131 1726867211.31908: variable 'ansible_distribution_major_version' from source: facts 13131 1726867211.31924: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867211.32051: variable 'network_state' from source: role '' defaults 13131 1726867211.32066: Evaluated conditional (network_state != {}): False 13131 1726867211.32074: when evaluation is False, skipping this task 13131 1726867211.32084: _execute() done 13131 1726867211.32092: dumping result to json 13131 1726867211.32098: done dumping result, returning 13131 1726867211.32115: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-5f24-9b7a-000000000085] 13131 1726867211.32124: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000085 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867211.32343: no more pending results, returning what we have 13131 1726867211.32348: results queue empty 13131 1726867211.32349: checking for any_errors_fatal 13131 1726867211.32353: done checking for any_errors_fatal 13131 1726867211.32354: checking for max_fail_percentage 13131 
1726867211.32356: done checking for max_fail_percentage 13131 1726867211.32357: checking to see if all hosts have failed and the running result is not ok 13131 1726867211.32358: done checking to see if all hosts have failed 13131 1726867211.32358: getting the remaining hosts for this loop 13131 1726867211.32360: done getting the remaining hosts for this loop 13131 1726867211.32363: getting the next task for host managed_node1 13131 1726867211.32370: done getting next task for host managed_node1 13131 1726867211.32374: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13131 1726867211.32379: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867211.32398: getting variables 13131 1726867211.32403: in VariableManager get_vars() 13131 1726867211.32457: Calling all_inventory to load vars for managed_node1 13131 1726867211.32460: Calling groups_inventory to load vars for managed_node1 13131 1726867211.32463: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867211.32475: Calling all_plugins_play to load vars for managed_node1 13131 1726867211.32668: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867211.32675: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000085 13131 1726867211.32679: WORKER PROCESS EXITING 13131 1726867211.32684: Calling groups_plugins_play to load vars for managed_node1 13131 1726867211.34090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867211.35674: done with get_vars() 13131 1726867211.35699: done getting variables 13131 1726867211.35758: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:20:11 -0400 (0:00:00.052) 0:00:26.468 ****** 13131 1726867211.35793: entering _queue_task() for managed_node1/service 13131 1726867211.36109: worker is 1 (out of 1 available) 13131 1726867211.36123: exiting _queue_task() for managed_node1/service 13131 1726867211.36135: done queuing things up, now waiting for results queue to drain 13131 1726867211.36136: waiting for pending results... 
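The "Restart NetworkManager" task just queued uses the same wireless-or-team conditional that the earlier consent task evaluated, `__network_wireless_connections_defined or __network_team_connections_defined`, which came out False for this play's connection profiles. A sketch of that evaluation (flag values mirror the logged consent-task result):

```python
# Derived flags: the role scans network_connections for wireless and
# team interface types; this play defines neither.
wireless_defined = False
team_defined = False

# NetworkManager only needs a restart when either kind is present.
restart_required = wireless_defined or team_defined
print(restart_required)  # False, matching the consent task's evaluation
```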
13131 1726867211.36422: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13131 1726867211.36562: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000086 13131 1726867211.36584: variable 'ansible_search_path' from source: unknown 13131 1726867211.36591: variable 'ansible_search_path' from source: unknown 13131 1726867211.36638: calling self._execute() 13131 1726867211.36744: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867211.36755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867211.36817: variable 'omit' from source: magic vars 13131 1726867211.37147: variable 'ansible_distribution_major_version' from source: facts 13131 1726867211.37162: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867211.37283: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867211.37504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867211.39764: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867211.39845: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867211.39894: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867211.39968: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867211.39971: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867211.40050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13131 1726867211.40090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867211.40122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.40166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867211.40292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867211.40295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867211.40297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867211.40302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.40335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867211.40352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867211.40396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867211.40429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867211.40458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.40502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867211.40526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867211.40707: variable 'network_connections' from source: task vars 13131 1726867211.40728: variable 'controller_profile' from source: play vars 13131 1726867211.40795: variable 'controller_profile' from source: play vars 13131 1726867211.40876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867211.41282: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867211.41286: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867211.41289: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 
1726867211.41291: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867211.41293: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867211.41294: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867211.41296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.41298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867211.41346: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867211.41591: variable 'network_connections' from source: task vars 13131 1726867211.41606: variable 'controller_profile' from source: play vars 13131 1726867211.41673: variable 'controller_profile' from source: play vars 13131 1726867211.41707: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13131 1726867211.41716: when evaluation is False, skipping this task 13131 1726867211.41722: _execute() done 13131 1726867211.41728: dumping result to json 13131 1726867211.41738: done dumping result, returning 13131 1726867211.41750: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-000000000086] 13131 1726867211.41758: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000086 13131 1726867211.41983: done sending task 
result for task 0affcac9-a3a5-5f24-9b7a-000000000086 13131 1726867211.41995: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13131 1726867211.42047: no more pending results, returning what we have 13131 1726867211.42050: results queue empty 13131 1726867211.42052: checking for any_errors_fatal 13131 1726867211.42060: done checking for any_errors_fatal 13131 1726867211.42061: checking for max_fail_percentage 13131 1726867211.42062: done checking for max_fail_percentage 13131 1726867211.42063: checking to see if all hosts have failed and the running result is not ok 13131 1726867211.42064: done checking to see if all hosts have failed 13131 1726867211.42065: getting the remaining hosts for this loop 13131 1726867211.42066: done getting the remaining hosts for this loop 13131 1726867211.42070: getting the next task for host managed_node1 13131 1726867211.42079: done getting next task for host managed_node1 13131 1726867211.42083: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13131 1726867211.42086: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867211.42106: getting variables 13131 1726867211.42108: in VariableManager get_vars() 13131 1726867211.42164: Calling all_inventory to load vars for managed_node1 13131 1726867211.42167: Calling groups_inventory to load vars for managed_node1 13131 1726867211.42170: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867211.42182: Calling all_plugins_play to load vars for managed_node1 13131 1726867211.42186: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867211.42189: Calling groups_plugins_play to load vars for managed_node1 13131 1726867211.43738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867211.45346: done with get_vars() 13131 1726867211.45369: done getting variables 13131 1726867211.45429: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:20:11 -0400 (0:00:00.096) 0:00:26.565 ****** 13131 1726867211.45460: entering _queue_task() for managed_node1/service 13131 1726867211.45748: worker is 1 (out of 1 available) 13131 1726867211.45760: exiting _queue_task() for managed_node1/service 13131 1726867211.45773: done queuing things up, now waiting for results queue to drain 13131 1726867211.45775: waiting for pending results... 
13131 1726867211.46109: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13131 1726867211.46275: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000087 13131 1726867211.46301: variable 'ansible_search_path' from source: unknown 13131 1726867211.46374: variable 'ansible_search_path' from source: unknown 13131 1726867211.46380: calling self._execute() 13131 1726867211.46464: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867211.46484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867211.46504: variable 'omit' from source: magic vars 13131 1726867211.46905: variable 'ansible_distribution_major_version' from source: facts 13131 1726867211.46932: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867211.47106: variable 'network_provider' from source: set_fact 13131 1726867211.47117: variable 'network_state' from source: role '' defaults 13131 1726867211.47145: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13131 1726867211.47282: variable 'omit' from source: magic vars 13131 1726867211.47285: variable 'omit' from source: magic vars 13131 1726867211.47288: variable 'network_service_name' from source: role '' defaults 13131 1726867211.47324: variable 'network_service_name' from source: role '' defaults 13131 1726867211.47436: variable '__network_provider_setup' from source: role '' defaults 13131 1726867211.47451: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867211.47523: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867211.47538: variable '__network_packages_default_nm' from source: role '' defaults 13131 1726867211.47602: variable '__network_packages_default_nm' from source: role '' defaults 13131 1726867211.47841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13131 1726867211.50197: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867211.50280: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867211.50321: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867211.50366: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867211.50403: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867211.50682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867211.50686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867211.50688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.50690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867211.50692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867211.50694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13131 1726867211.50696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867211.50698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.50745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867211.50763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867211.50996: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13131 1726867211.51117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867211.51152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867211.51182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.51224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867211.51253: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867211.51343: variable 'ansible_python' from source: facts 13131 1726867211.51372: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13131 1726867211.51452: variable '__network_wpa_supplicant_required' from source: role '' defaults 13131 1726867211.51745: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13131 1726867211.51872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867211.51909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867211.51939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.51983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867211.52082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867211.52085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867211.52095: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867211.52123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.52164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867211.52184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867211.52324: variable 'network_connections' from source: task vars 13131 1726867211.52343: variable 'controller_profile' from source: play vars 13131 1726867211.52417: variable 'controller_profile' from source: play vars 13131 1726867211.52527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867211.52732: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867211.52790: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867211.52835: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867211.52984: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867211.52988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867211.52990: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867211.53012: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867211.53048: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867211.53102: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867211.53375: variable 'network_connections' from source: task vars 13131 1726867211.53391: variable 'controller_profile' from source: play vars 13131 1726867211.53468: variable 'controller_profile' from source: play vars 13131 1726867211.53506: variable '__network_packages_default_wireless' from source: role '' defaults 13131 1726867211.53592: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867211.53888: variable 'network_connections' from source: task vars 13131 1726867211.53898: variable 'controller_profile' from source: play vars 13131 1726867211.53972: variable 'controller_profile' from source: play vars 13131 1726867211.54001: variable '__network_packages_default_team' from source: role '' defaults 13131 1726867211.54086: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867211.54373: variable 'network_connections' from source: task vars 13131 1726867211.54390: variable 'controller_profile' from source: play vars 13131 1726867211.54461: variable 'controller_profile' from source: play vars 13131 1726867211.54522: variable '__network_service_name_default_initscripts' from source: role '' defaults 13131 1726867211.54610: variable '__network_service_name_default_initscripts' from 
source: role '' defaults 13131 1726867211.54617: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 1726867211.54663: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 1726867211.54890: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13131 1726867211.55484: variable 'network_connections' from source: task vars 13131 1726867211.55488: variable 'controller_profile' from source: play vars 13131 1726867211.55584: variable 'controller_profile' from source: play vars 13131 1726867211.55588: variable 'ansible_distribution' from source: facts 13131 1726867211.55591: variable '__network_rh_distros' from source: role '' defaults 13131 1726867211.55593: variable 'ansible_distribution_major_version' from source: facts 13131 1726867211.55596: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13131 1726867211.55728: variable 'ansible_distribution' from source: facts 13131 1726867211.55731: variable '__network_rh_distros' from source: role '' defaults 13131 1726867211.55733: variable 'ansible_distribution_major_version' from source: facts 13131 1726867211.55739: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13131 1726867211.55933: variable 'ansible_distribution' from source: facts 13131 1726867211.55937: variable '__network_rh_distros' from source: role '' defaults 13131 1726867211.55942: variable 'ansible_distribution_major_version' from source: facts 13131 1726867211.55980: variable 'network_provider' from source: set_fact 13131 1726867211.56008: variable 'omit' from source: magic vars 13131 1726867211.56039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867211.56068: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867211.56087: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867211.56108: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867211.56118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867211.56152: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867211.56155: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867211.56158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867211.56286: Set connection var ansible_connection to ssh 13131 1726867211.56289: Set connection var ansible_timeout to 10 13131 1726867211.56292: Set connection var ansible_shell_type to sh 13131 1726867211.56294: Set connection var ansible_shell_executable to /bin/sh 13131 1726867211.56296: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867211.56298: Set connection var ansible_pipelining to False 13131 1726867211.56365: variable 'ansible_shell_executable' from source: unknown 13131 1726867211.56369: variable 'ansible_connection' from source: unknown 13131 1726867211.56371: variable 'ansible_module_compression' from source: unknown 13131 1726867211.56373: variable 'ansible_shell_type' from source: unknown 13131 1726867211.56375: variable 'ansible_shell_executable' from source: unknown 13131 1726867211.56379: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867211.56381: variable 'ansible_pipelining' from source: unknown 13131 1726867211.56383: variable 'ansible_timeout' from source: unknown 13131 1726867211.56385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867211.56451: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867211.56474: variable 'omit' from source: magic vars 13131 1726867211.56478: starting attempt loop 13131 1726867211.56481: running the handler 13131 1726867211.56584: variable 'ansible_facts' from source: unknown 13131 1726867211.57274: _low_level_execute_command(): starting 13131 1726867211.57282: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867211.58203: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867211.58210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867211.58227: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867211.58232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867211.58238: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867211.58243: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867211.58283: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867211.58286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867211.58289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867211.58291: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867211.58348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867211.58373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867211.58465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867211.60145: stdout chunk (state=3): >>>/root <<< 13131 1726867211.60299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867211.60302: stdout chunk (state=3): >>><<< 13131 1726867211.60304: stderr chunk (state=3): >>><<< 13131 1726867211.60322: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867211.60411: 
_low_level_execute_command(): starting 13131 1726867211.60415: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867211.6032798-14484-274795525409272 `" && echo ansible-tmp-1726867211.6032798-14484-274795525409272="` echo /root/.ansible/tmp/ansible-tmp-1726867211.6032798-14484-274795525409272 `" ) && sleep 0' 13131 1726867211.60971: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867211.60989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867211.61051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867211.61119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867211.61134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867211.61167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867211.61255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867211.63226: stdout chunk (state=3): 
>>>ansible-tmp-1726867211.6032798-14484-274795525409272=/root/.ansible/tmp/ansible-tmp-1726867211.6032798-14484-274795525409272 <<< 13131 1726867211.63383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867211.63386: stdout chunk (state=3): >>><<< 13131 1726867211.63388: stderr chunk (state=3): >>><<< 13131 1726867211.63402: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867211.6032798-14484-274795525409272=/root/.ansible/tmp/ansible-tmp-1726867211.6032798-14484-274795525409272 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867211.63587: variable 'ansible_module_compression' from source: unknown 13131 1726867211.63590: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13131 1726867211.63592: variable 'ansible_facts' 
from source: unknown 13131 1726867211.63761: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867211.6032798-14484-274795525409272/AnsiballZ_systemd.py 13131 1726867211.64037: Sending initial data 13131 1726867211.64048: Sent initial data (156 bytes) 13131 1726867211.64582: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867211.64600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867211.64604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867211.64663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867211.64691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867211.64710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867211.64794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867211.66447: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: 
Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13131 1726867211.66461: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867211.66495: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13131 1726867211.66547: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpx5r8ef3h /root/.ansible/tmp/ansible-tmp-1726867211.6032798-14484-274795525409272/AnsiballZ_systemd.py <<< 13131 1726867211.66550: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867211.6032798-14484-274795525409272/AnsiballZ_systemd.py" <<< 13131 1726867211.66594: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpx5r8ef3h" to remote "/root/.ansible/tmp/ansible-tmp-1726867211.6032798-14484-274795525409272/AnsiballZ_systemd.py" <<< 13131 1726867211.66597: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867211.6032798-14484-274795525409272/AnsiballZ_systemd.py" <<< 13131 1726867211.67727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867211.67763: stderr chunk (state=3): >>><<< 13131 1726867211.67767: stdout chunk (state=3): >>><<< 13131 1726867211.67769: done transferring module to remote 13131 1726867211.67775: _low_level_execute_command(): starting 13131 1726867211.67785: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867211.6032798-14484-274795525409272/ /root/.ansible/tmp/ansible-tmp-1726867211.6032798-14484-274795525409272/AnsiballZ_systemd.py && sleep 0' 13131 1726867211.68720: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867211.68723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867211.68725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867211.68728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867211.68730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867211.68818: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867211.68822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867211.68824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867211.68827: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867211.68884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867211.68887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867211.68890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867211.68961: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867211.70698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867211.70719: stderr chunk (state=3): >>><<< 13131 1726867211.70725: stdout chunk (state=3): >>><<< 13131 1726867211.70739: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867211.70746: _low_level_execute_command(): starting 13131 1726867211.70753: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867211.6032798-14484-274795525409272/AnsiballZ_systemd.py && sleep 0' 13131 1726867211.71324: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867211.71597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867211.71984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867212.00772: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", 
"Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": "14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10747904", "MemoryPeak": "14745600", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3309039616", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "769047000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", 
"IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 13131 1726867212.00810: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": 
"819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": 
"all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target NetworkManager-wait-online.service network.target", "After": "dbus-br<<< 13131 1726867212.00818: stdout chunk (state=3): >>>oker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 
2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", "AssertTimestampMonotonic": "14925366", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13131 1726867212.02622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867212.02644: stderr chunk (state=3): >>><<< 13131 1726867212.02647: stdout chunk (state=3): >>><<< 13131 1726867212.02663: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": "14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10747904", "MemoryPeak": "14745600", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3309039616", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "769047000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target NetworkManager-wait-online.service network.target", "After": "dbus-broker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", "AssertTimestampMonotonic": "14925366", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867212.02787: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867211.6032798-14484-274795525409272/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867212.02806: _low_level_execute_command(): starting 13131 1726867212.02809: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867211.6032798-14484-274795525409272/ > /dev/null 2>&1 && sleep 0' 13131 1726867212.03237: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867212.03241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867212.03243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867212.03246: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867212.03248: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867212.03283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867212.03297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867212.03345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867212.05130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867212.05150: stderr chunk (state=3): >>><<< 13131 1726867212.05153: stdout chunk (state=3): >>><<< 13131 1726867212.05165: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867212.05171: handler run complete 13131 1726867212.05217: attempt loop complete, returning result 13131 1726867212.05220: _execute() done 13131 1726867212.05223: dumping result to json 13131 1726867212.05234: done dumping result, returning 13131 1726867212.05242: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-5f24-9b7a-000000000087] 13131 1726867212.05246: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000087 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867212.05742: no more pending results, returning what we have 13131 1726867212.05744: results queue empty 13131 1726867212.05744: checking for any_errors_fatal 13131 1726867212.05747: done checking for any_errors_fatal 13131 1726867212.05747: checking for max_fail_percentage 13131 1726867212.05748: done checking for max_fail_percentage 13131 1726867212.05749: checking to see if all hosts have failed and the running result is not ok 13131 1726867212.05749: done checking to see if all hosts have failed 13131 1726867212.05749: getting the remaining hosts for this loop 13131 1726867212.05750: done getting the remaining hosts for this loop 13131 1726867212.05753: getting the next task for host managed_node1 13131 1726867212.05756: done getting next task for host managed_node1 13131 1726867212.05758: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13131 1726867212.05760: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867212.05769: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000087 13131 1726867212.05772: WORKER PROCESS EXITING 13131 1726867212.05778: getting variables 13131 1726867212.05779: in VariableManager get_vars() 13131 1726867212.05815: Calling all_inventory to load vars for managed_node1 13131 1726867212.05817: Calling groups_inventory to load vars for managed_node1 13131 1726867212.05819: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867212.05826: Calling all_plugins_play to load vars for managed_node1 13131 1726867212.05827: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867212.05829: Calling groups_plugins_play to load vars for managed_node1 13131 1726867212.06502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867212.07672: done with get_vars() 13131 1726867212.07693: done getting variables 13131 1726867212.07736: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:20:12 -0400 (0:00:00.623) 0:00:27.188 ****** 13131 1726867212.07761: entering _queue_task() for 
managed_node1/service 13131 1726867212.08010: worker is 1 (out of 1 available) 13131 1726867212.08024: exiting _queue_task() for managed_node1/service 13131 1726867212.08037: done queuing things up, now waiting for results queue to drain 13131 1726867212.08038: waiting for pending results... 13131 1726867212.08229: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13131 1726867212.08325: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000088 13131 1726867212.08337: variable 'ansible_search_path' from source: unknown 13131 1726867212.08340: variable 'ansible_search_path' from source: unknown 13131 1726867212.08376: calling self._execute() 13131 1726867212.08449: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867212.08455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867212.08462: variable 'omit' from source: magic vars 13131 1726867212.08742: variable 'ansible_distribution_major_version' from source: facts 13131 1726867212.08752: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867212.08834: variable 'network_provider' from source: set_fact 13131 1726867212.08838: Evaluated conditional (network_provider == "nm"): True 13131 1726867212.08902: variable '__network_wpa_supplicant_required' from source: role '' defaults 13131 1726867212.08967: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13131 1726867212.09141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867212.12385: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867212.12390: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867212.12393: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867212.12395: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867212.12398: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867212.12604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867212.12643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867212.12681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867212.12732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867212.12755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867212.12816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867212.12848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867212.12883: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867212.12932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867212.12955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867212.13010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867212.13040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867212.13069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867212.13116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867212.13137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867212.13426: variable 'network_connections' from source: task vars 13131 1726867212.13455: variable 'controller_profile' from source: play vars 13131 1726867212.13533: variable 'controller_profile' from source: play vars 13131 
1726867212.13621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867212.13795: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867212.13841: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867212.13884: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867212.13922: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867212.13974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867212.14010: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867212.14042: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867212.14083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867212.14138: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867212.14411: variable 'network_connections' from source: task vars 13131 1726867212.14450: variable 'controller_profile' from source: play vars 13131 1726867212.14523: variable 'controller_profile' from source: play vars 13131 1726867212.14558: Evaluated conditional (__network_wpa_supplicant_required): False 13131 1726867212.14575: when evaluation is False, skipping this task 13131 
1726867212.14581: _execute() done 13131 1726867212.14583: dumping result to json 13131 1726867212.14586: done dumping result, returning 13131 1726867212.14589: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-5f24-9b7a-000000000088] 13131 1726867212.14599: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000088 13131 1726867212.14686: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000088 13131 1726867212.14688: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13131 1726867212.14755: no more pending results, returning what we have 13131 1726867212.14764: results queue empty 13131 1726867212.14766: checking for any_errors_fatal 13131 1726867212.14788: done checking for any_errors_fatal 13131 1726867212.14788: checking for max_fail_percentage 13131 1726867212.14790: done checking for max_fail_percentage 13131 1726867212.14791: checking to see if all hosts have failed and the running result is not ok 13131 1726867212.14791: done checking to see if all hosts have failed 13131 1726867212.14792: getting the remaining hosts for this loop 13131 1726867212.14793: done getting the remaining hosts for this loop 13131 1726867212.14797: getting the next task for host managed_node1 13131 1726867212.14803: done getting next task for host managed_node1 13131 1726867212.14806: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13131 1726867212.14809: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867212.14824: getting variables 13131 1726867212.14826: in VariableManager get_vars() 13131 1726867212.14880: Calling all_inventory to load vars for managed_node1 13131 1726867212.14882: Calling groups_inventory to load vars for managed_node1 13131 1726867212.14885: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867212.14893: Calling all_plugins_play to load vars for managed_node1 13131 1726867212.14895: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867212.14897: Calling groups_plugins_play to load vars for managed_node1 13131 1726867212.15742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867212.16594: done with get_vars() 13131 1726867212.16612: done getting variables 13131 1726867212.16652: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:20:12 -0400 (0:00:00.089) 0:00:27.277 ****** 13131 1726867212.16673: entering _queue_task() for managed_node1/service 13131 1726867212.16878: worker is 1 (out of 1 available) 13131 1726867212.16892: exiting _queue_task() for managed_node1/service 13131 1726867212.16903: done queuing things up, now waiting for results queue to drain 13131 1726867212.16904: waiting for pending results... 
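The wpa_supplicant task above was skipped because `Evaluated conditional (__network_wpa_supplicant_required): False`, and the skipped result records that clause as `false_condition`. A minimal sketch of that skip decision (assuming, as the log suggests, that each `when:` clause is checked in order and the first false one short-circuits the task; `evaluate_when` and the use of `eval` in place of Jinja2 templating are illustrative assumptions):

```python
# Hedged sketch of the 'when:' skip logic seen in the log: clauses are
# evaluated in order; the first False clause stops the task and is
# reported back as 'false_condition' in the skipped result.
def evaluate_when(conditions, variables):
    """Return (run, skipped_result); skipped_result is None when the task runs."""
    for cond in conditions:
        # Ansible templates each clause with Jinja2; eval() is a stand-in here.
        if not eval(cond, {}, variables):
            return False, {
                "changed": False,
                "false_condition": cond,
                "skip_reason": "Conditional result was False",
            }
    return True, None

run, result = evaluate_when(
    ['ansible_distribution_major_version != "6"',
     'network_provider == "initscripts"'],
    {"ansible_distribution_major_version": "9", "network_provider": "nm"},
)
```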
13131 1726867212.17082: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 13131 1726867212.17168: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000089 13131 1726867212.17180: variable 'ansible_search_path' from source: unknown 13131 1726867212.17184: variable 'ansible_search_path' from source: unknown 13131 1726867212.17216: calling self._execute() 13131 1726867212.17291: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867212.17295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867212.17307: variable 'omit' from source: magic vars 13131 1726867212.17586: variable 'ansible_distribution_major_version' from source: facts 13131 1726867212.17589: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867212.17656: variable 'network_provider' from source: set_fact 13131 1726867212.17660: Evaluated conditional (network_provider == "initscripts"): False 13131 1726867212.17663: when evaluation is False, skipping this task 13131 1726867212.17665: _execute() done 13131 1726867212.17669: dumping result to json 13131 1726867212.17672: done dumping result, returning 13131 1726867212.17679: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-5f24-9b7a-000000000089] 13131 1726867212.17684: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000089 13131 1726867212.17768: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000089 13131 1726867212.17771: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867212.17821: no more pending results, returning what we have 13131 1726867212.17824: results queue empty 13131 1726867212.17825: checking for any_errors_fatal 13131 1726867212.17832: done checking for 
any_errors_fatal 13131 1726867212.17832: checking for max_fail_percentage 13131 1726867212.17834: done checking for max_fail_percentage 13131 1726867212.17834: checking to see if all hosts have failed and the running result is not ok 13131 1726867212.17835: done checking to see if all hosts have failed 13131 1726867212.17836: getting the remaining hosts for this loop 13131 1726867212.17837: done getting the remaining hosts for this loop 13131 1726867212.17840: getting the next task for host managed_node1 13131 1726867212.17847: done getting next task for host managed_node1 13131 1726867212.17849: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13131 1726867212.17852: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867212.17867: getting variables 13131 1726867212.17868: in VariableManager get_vars() 13131 1726867212.17909: Calling all_inventory to load vars for managed_node1 13131 1726867212.17912: Calling groups_inventory to load vars for managed_node1 13131 1726867212.17914: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867212.17923: Calling all_plugins_play to load vars for managed_node1 13131 1726867212.17925: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867212.17927: Calling groups_plugins_play to load vars for managed_node1 13131 1726867212.18645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867212.19498: done with get_vars() 13131 1726867212.19512: done getting variables 13131 1726867212.19551: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:20:12 -0400 (0:00:00.028) 0:00:27.306 ****** 13131 1726867212.19572: entering _queue_task() for managed_node1/copy 13131 1726867212.19768: worker is 1 (out of 1 available) 13131 1726867212.19782: exiting _queue_task() for managed_node1/copy 13131 1726867212.19792: done queuing things up, now waiting for results queue to drain 13131 1726867212.19793: waiting for pending results... 
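The repeated `Calling ... to load vars` lines above (all_inventory, groups_inventory, all_plugins_inventory, all_plugins_play, groups_plugins_inventory, groups_plugins_play) reflect variable sources being consulted in a fixed order. A minimal sketch under the assumption that later sources override earlier ones (`merge_vars` is a hypothetical helper, not VariableManager's real merge logic):

```python
# Hedged sketch of ordered variable merging: sources are applied lowest
# precedence first, so a later source (e.g. set_fact) wins over inventory.
def merge_vars(sources):
    merged = {}
    for src in sources:  # ordered from lowest to highest precedence
        merged.update(src)
    return merged

merged = merge_vars([
    {"ansible_host": "10.31.12.57", "network_provider": "initscripts"},  # inventory
    {"network_provider": "nm"},  # higher-precedence source overrides
])
```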
13131 1726867212.19960: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13131 1726867212.20052: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000008a 13131 1726867212.20063: variable 'ansible_search_path' from source: unknown 13131 1726867212.20066: variable 'ansible_search_path' from source: unknown 13131 1726867212.20095: calling self._execute() 13131 1726867212.20166: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867212.20170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867212.20182: variable 'omit' from source: magic vars 13131 1726867212.20457: variable 'ansible_distribution_major_version' from source: facts 13131 1726867212.20475: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867212.20547: variable 'network_provider' from source: set_fact 13131 1726867212.20550: Evaluated conditional (network_provider == "initscripts"): False 13131 1726867212.20553: when evaluation is False, skipping this task 13131 1726867212.20557: _execute() done 13131 1726867212.20560: dumping result to json 13131 1726867212.20562: done dumping result, returning 13131 1726867212.20572: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-5f24-9b7a-00000000008a] 13131 1726867212.20576: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000008a 13131 1726867212.20657: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000008a 13131 1726867212.20660: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13131 1726867212.20720: no more pending results, returning what we have 13131 1726867212.20723: results queue empty 13131 1726867212.20724: checking for 
any_errors_fatal 13131 1726867212.20727: done checking for any_errors_fatal 13131 1726867212.20728: checking for max_fail_percentage 13131 1726867212.20729: done checking for max_fail_percentage 13131 1726867212.20730: checking to see if all hosts have failed and the running result is not ok 13131 1726867212.20731: done checking to see if all hosts have failed 13131 1726867212.20731: getting the remaining hosts for this loop 13131 1726867212.20732: done getting the remaining hosts for this loop 13131 1726867212.20735: getting the next task for host managed_node1 13131 1726867212.20740: done getting next task for host managed_node1 13131 1726867212.20743: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13131 1726867212.20746: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867212.20760: getting variables 13131 1726867212.20761: in VariableManager get_vars() 13131 1726867212.20802: Calling all_inventory to load vars for managed_node1 13131 1726867212.20805: Calling groups_inventory to load vars for managed_node1 13131 1726867212.20807: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867212.20815: Calling all_plugins_play to load vars for managed_node1 13131 1726867212.20817: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867212.20819: Calling groups_plugins_play to load vars for managed_node1 13131 1726867212.21622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867212.22966: done with get_vars() 13131 1726867212.22983: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:20:12 -0400 (0:00:00.034) 0:00:27.340 ****** 13131 1726867212.23039: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 13131 1726867212.23221: worker is 1 (out of 1 available) 13131 1726867212.23234: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 13131 1726867212.23244: done queuing things up, now waiting for results queue to drain 13131 1726867212.23246: waiting for pending results... 
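Shortly below, the template lookup for `get_ansible_managed.j2` logs a `search_path` where each directory on the evaluation path is tried twice: first its `templates/` subdirectory, then the directory itself. A minimal sketch of that candidate-list construction (`build_search_path` is an illustrative helper, not the lookup plugin's actual code):

```python
# Hedged sketch of how the template lookup's search_path is assembled:
# for each base directory, try <base>/templates/<term> then <base>/<term>.
import os

def build_search_path(eval_paths, term):
    candidates = []
    for base in eval_paths:
        candidates.append(os.path.join(base, "templates", term))
        candidates.append(os.path.join(base, term))
    return candidates

paths = build_search_path(
    ["roles/network", "roles/network/tasks"],  # abbreviated evaluation path
    "get_ansible_managed.j2",
)
```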
13131 1726867212.23418: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13131 1726867212.23508: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000008b 13131 1726867212.23521: variable 'ansible_search_path' from source: unknown 13131 1726867212.23524: variable 'ansible_search_path' from source: unknown 13131 1726867212.23551: calling self._execute() 13131 1726867212.23627: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867212.23631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867212.23640: variable 'omit' from source: magic vars 13131 1726867212.23905: variable 'ansible_distribution_major_version' from source: facts 13131 1726867212.23918: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867212.23923: variable 'omit' from source: magic vars 13131 1726867212.23956: variable 'omit' from source: magic vars 13131 1726867212.24070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867212.25982: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867212.25985: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867212.25988: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867212.25990: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867212.25992: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867212.26067: variable 'network_provider' from source: set_fact 13131 1726867212.26192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867212.26238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867212.26267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867212.26313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867212.26332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867212.26412: variable 'omit' from source: magic vars 13131 1726867212.26523: variable 'omit' from source: magic vars 13131 1726867212.26626: variable 'network_connections' from source: task vars 13131 1726867212.26643: variable 'controller_profile' from source: play vars 13131 1726867212.26711: variable 'controller_profile' from source: play vars 13131 1726867212.26855: variable 'omit' from source: magic vars 13131 1726867212.26867: variable '__lsr_ansible_managed' from source: task vars 13131 1726867212.26930: variable '__lsr_ansible_managed' from source: task vars 13131 1726867212.27094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13131 1726867212.27308: Loaded config def from plugin (lookup/template) 13131 1726867212.27317: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13131 1726867212.27346: File lookup term: get_ansible_managed.j2 13131 1726867212.27353: 
variable 'ansible_search_path' from source: unknown 13131 1726867212.27362: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13131 1726867212.27380: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13131 1726867212.27402: variable 'ansible_search_path' from source: unknown 13131 1726867212.38933: variable 'ansible_managed' from source: unknown 13131 1726867212.39065: variable 'omit' from source: magic vars 13131 1726867212.39158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867212.39161: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867212.39163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867212.39166: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867212.39168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867212.39170: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867212.39173: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867212.39178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867212.39272: Set connection var ansible_connection to ssh 13131 1726867212.39280: Set connection var ansible_timeout to 10 13131 1726867212.39283: Set connection var ansible_shell_type to sh 13131 1726867212.39384: Set connection var ansible_shell_executable to /bin/sh 13131 1726867212.39387: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867212.39390: Set connection var ansible_pipelining to False 13131 1726867212.39392: variable 'ansible_shell_executable' from source: unknown 13131 1726867212.39394: variable 'ansible_connection' from source: unknown 13131 1726867212.39396: variable 'ansible_module_compression' from source: unknown 13131 1726867212.39398: variable 'ansible_shell_type' from source: unknown 13131 1726867212.39401: variable 'ansible_shell_executable' from source: unknown 13131 1726867212.39403: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867212.39405: variable 'ansible_pipelining' from source: unknown 13131 1726867212.39408: variable 'ansible_timeout' from source: unknown 13131 1726867212.39410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867212.39501: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867212.39517: variable 'omit' from source: magic vars 13131 1726867212.39527: starting attempt loop 13131 1726867212.39530: running the handler 13131 1726867212.39541: _low_level_execute_command(): starting 13131 1726867212.39546: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867212.40404: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867212.40408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867212.40411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867212.40413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867212.40542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867212.42147: stdout chunk (state=3): >>>/root <<< 13131 1726867212.42312: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 13131 1726867212.42316: stdout chunk (state=3): >>><<< 13131 1726867212.42318: stderr chunk (state=3): >>><<< 13131 1726867212.42338: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867212.42357: _low_level_execute_command(): starting 13131 1726867212.42368: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867212.4234471-14527-163209719327566 `" && echo ansible-tmp-1726867212.4234471-14527-163209719327566="` echo /root/.ansible/tmp/ansible-tmp-1726867212.4234471-14527-163209719327566 `" ) && sleep 0' 13131 1726867212.42952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867212.42966: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 13131 1726867212.42985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867212.43005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867212.43023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867212.43036: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867212.43050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867212.43069: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867212.43084: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867212.43096: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867212.43112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867212.43197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867212.43223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867212.43239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867212.43317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867212.45227: stdout chunk (state=3): >>>ansible-tmp-1726867212.4234471-14527-163209719327566=/root/.ansible/tmp/ansible-tmp-1726867212.4234471-14527-163209719327566 <<< 13131 1726867212.45361: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 13131 1726867212.45372: stdout chunk (state=3): >>><<< 13131 1726867212.45391: stderr chunk (state=3): >>><<< 13131 1726867212.45412: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867212.4234471-14527-163209719327566=/root/.ansible/tmp/ansible-tmp-1726867212.4234471-14527-163209719327566 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867212.45454: variable 'ansible_module_compression' from source: unknown 13131 1726867212.45498: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13131 1726867212.45552: variable 'ansible_facts' from source: unknown 13131 1726867212.45704: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726867212.4234471-14527-163209719327566/AnsiballZ_network_connections.py 13131 1726867212.45922: Sending initial data 13131 1726867212.45936: Sent initial data (168 bytes) 13131 1726867212.46506: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867212.46565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867212.46592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867212.46613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867212.46679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867212.48232: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867212.48303: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13131 1726867212.48355: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp9a5q8iwf /root/.ansible/tmp/ansible-tmp-1726867212.4234471-14527-163209719327566/AnsiballZ_network_connections.py <<< 13131 1726867212.48365: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867212.4234471-14527-163209719327566/AnsiballZ_network_connections.py" <<< 13131 1726867212.48417: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp9a5q8iwf" to remote "/root/.ansible/tmp/ansible-tmp-1726867212.4234471-14527-163209719327566/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867212.4234471-14527-163209719327566/AnsiballZ_network_connections.py" <<< 13131 1726867212.49618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867212.49656: stderr chunk (state=3): >>><<< 13131 1726867212.49663: stdout chunk (state=3): >>><<< 13131 1726867212.49732: done transferring module to remote 13131 1726867212.49745: _low_level_execute_command(): starting 13131 1726867212.49812: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867212.4234471-14527-163209719327566/ /root/.ansible/tmp/ansible-tmp-1726867212.4234471-14527-163209719327566/AnsiballZ_network_connections.py && sleep 
0' 13131 1726867212.50345: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867212.50360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867212.50386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867212.50498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867212.50520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867212.50537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867212.50620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867212.52392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867212.52410: stderr chunk (state=3): >>><<< 13131 1726867212.52413: stdout chunk (state=3): >>><<< 13131 1726867212.52422: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867212.52427: _low_level_execute_command(): starting 13131 1726867212.52461: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867212.4234471-14527-163209719327566/AnsiballZ_network_connections.py && sleep 0' 13131 1726867212.52847: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867212.52850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867212.52853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867212.52855: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867212.52857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867212.52913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867212.52917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867212.52962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867212.96610: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xk3sng6z/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xk3sng6z/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/df9218e5-fcde-46a8-b91e-9607fcfd47af: error=unknown <<< 13131 1726867212.96844: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, 
"force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13131 1726867212.98674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867212.98703: stderr chunk (state=3): >>><<< 13131 1726867212.98707: stdout chunk (state=3): >>><<< 13131 1726867212.98720: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xk3sng6z/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xk3sng6z/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/df9218e5-fcde-46a8-b91e-9607fcfd47af: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, 
"force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867212.98746: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867212.4234471-14527-163209719327566/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867212.98753: _low_level_execute_command(): starting 13131 1726867212.98758: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867212.4234471-14527-163209719327566/ > /dev/null 2>&1 && sleep 0' 13131 1726867212.99164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867212.99168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867212.99202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867212.99206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13131 1726867212.99209: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867212.99211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867212.99260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867212.99264: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867212.99313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867213.01115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867213.01136: stderr chunk (state=3): >>><<< 13131 1726867213.01141: stdout chunk (state=3): >>><<< 13131 1726867213.01157: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867213.01163: handler run complete 13131 1726867213.01183: attempt loop complete, returning result 13131 1726867213.01186: _execute() done 13131 1726867213.01188: dumping result to json 13131 1726867213.01192: done dumping result, returning 13131 1726867213.01200: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-5f24-9b7a-00000000008b] 13131 1726867213.01206: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000008b 13131 1726867213.01305: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000008b 13131 1726867213.01308: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 13131 1726867213.01398: no more pending results, returning what we have 13131 1726867213.01402: results queue empty 13131 1726867213.01402: checking for any_errors_fatal 13131 1726867213.01408: done checking for any_errors_fatal 13131 1726867213.01408: checking for max_fail_percentage 13131 1726867213.01410: done checking for max_fail_percentage 13131 1726867213.01410: checking to see if all hosts have failed and the running result is not ok 13131 1726867213.01411: done checking to see if all hosts have failed 13131 1726867213.01412: getting the remaining hosts for this loop 13131 1726867213.01413: done getting the remaining hosts for this loop 13131 1726867213.01416: 
getting the next task for host managed_node1 13131 1726867213.01428: done getting next task for host managed_node1 13131 1726867213.01431: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13131 1726867213.01434: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867213.01445: getting variables 13131 1726867213.01446: in VariableManager get_vars() 13131 1726867213.01494: Calling all_inventory to load vars for managed_node1 13131 1726867213.01496: Calling groups_inventory to load vars for managed_node1 13131 1726867213.01498: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867213.01507: Calling all_plugins_play to load vars for managed_node1 13131 1726867213.01510: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867213.01512: Calling groups_plugins_play to load vars for managed_node1 13131 1726867213.02348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867213.03308: done with get_vars() 13131 1726867213.03325: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:20:13 -0400 (0:00:00.803) 0:00:28.144 ****** 13131 1726867213.03385: entering _queue_task() for 
managed_node1/fedora.linux_system_roles.network_state 13131 1726867213.03623: worker is 1 (out of 1 available) 13131 1726867213.03638: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 13131 1726867213.03651: done queuing things up, now waiting for results queue to drain 13131 1726867213.03653: waiting for pending results... 13131 1726867213.03833: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 13131 1726867213.03924: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000008c 13131 1726867213.03937: variable 'ansible_search_path' from source: unknown 13131 1726867213.03940: variable 'ansible_search_path' from source: unknown 13131 1726867213.03968: calling self._execute() 13131 1726867213.04042: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867213.04047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867213.04057: variable 'omit' from source: magic vars 13131 1726867213.04325: variable 'ansible_distribution_major_version' from source: facts 13131 1726867213.04335: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867213.04420: variable 'network_state' from source: role '' defaults 13131 1726867213.04429: Evaluated conditional (network_state != {}): False 13131 1726867213.04432: when evaluation is False, skipping this task 13131 1726867213.04435: _execute() done 13131 1726867213.04437: dumping result to json 13131 1726867213.04439: done dumping result, returning 13131 1726867213.04442: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-5f24-9b7a-00000000008c] 13131 1726867213.04448: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000008c 13131 1726867213.04535: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000008c 13131 1726867213.04538: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867213.04587: no more pending results, returning what we have 13131 1726867213.04590: results queue empty 13131 1726867213.04592: checking for any_errors_fatal 13131 1726867213.04600: done checking for any_errors_fatal 13131 1726867213.04600: checking for max_fail_percentage 13131 1726867213.04602: done checking for max_fail_percentage 13131 1726867213.04603: checking to see if all hosts have failed and the running result is not ok 13131 1726867213.04603: done checking to see if all hosts have failed 13131 1726867213.04604: getting the remaining hosts for this loop 13131 1726867213.04605: done getting the remaining hosts for this loop 13131 1726867213.04608: getting the next task for host managed_node1 13131 1726867213.04615: done getting next task for host managed_node1 13131 1726867213.04620: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13131 1726867213.04623: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867213.04639: getting variables 13131 1726867213.04641: in VariableManager get_vars() 13131 1726867213.04682: Calling all_inventory to load vars for managed_node1 13131 1726867213.04684: Calling groups_inventory to load vars for managed_node1 13131 1726867213.04686: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867213.04694: Calling all_plugins_play to load vars for managed_node1 13131 1726867213.04696: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867213.04699: Calling groups_plugins_play to load vars for managed_node1 13131 1726867213.05441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867213.06293: done with get_vars() 13131 1726867213.06308: done getting variables 13131 1726867213.06348: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:20:13 -0400 (0:00:00.029) 0:00:28.174 ****** 13131 1726867213.06371: entering _queue_task() for managed_node1/debug 13131 1726867213.06575: worker is 1 (out of 1 available) 13131 1726867213.06592: exiting _queue_task() for managed_node1/debug 13131 1726867213.06601: done queuing things up, now waiting for results queue to drain 13131 1726867213.06602: waiting for pending results... 
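The "Configure networking state" skip recorded above comes from a `when` guard: the log shows `Evaluated conditional (network_state != {}): False` because `network_state` is taken from the role's defaults, where it is an empty dict. A minimal sketch of that guard pattern follows; the task and module names are taken from the log (`fedora.linux_system_roles.network_state`), but the module arguments shown are assumptions for illustration, not the role's actual task file:

```yaml
# Illustrative sketch, not the role's real tasks/main.yml.
# `network_state` defaults to {} in the role defaults, so with no user
# override the `when` clause is False and Ansible logs
# "skipping: ... false_condition: network_state != {}".
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    desired_state: "{{ network_state }}"   # parameter name is assumed
  when: network_state != {}
```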
13131 1726867213.06771: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13131 1726867213.06858: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000008d 13131 1726867213.06870: variable 'ansible_search_path' from source: unknown 13131 1726867213.06873: variable 'ansible_search_path' from source: unknown 13131 1726867213.06902: calling self._execute() 13131 1726867213.06972: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867213.06979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867213.06989: variable 'omit' from source: magic vars 13131 1726867213.07244: variable 'ansible_distribution_major_version' from source: facts 13131 1726867213.07255: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867213.07266: variable 'omit' from source: magic vars 13131 1726867213.07301: variable 'omit' from source: magic vars 13131 1726867213.07330: variable 'omit' from source: magic vars 13131 1726867213.07359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867213.07389: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867213.07407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867213.07421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867213.07430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867213.07455: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867213.07458: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867213.07461: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 13131 1726867213.07531: Set connection var ansible_connection to ssh 13131 1726867213.07537: Set connection var ansible_timeout to 10 13131 1726867213.07540: Set connection var ansible_shell_type to sh 13131 1726867213.07547: Set connection var ansible_shell_executable to /bin/sh 13131 1726867213.07554: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867213.07559: Set connection var ansible_pipelining to False 13131 1726867213.07576: variable 'ansible_shell_executable' from source: unknown 13131 1726867213.07580: variable 'ansible_connection' from source: unknown 13131 1726867213.07583: variable 'ansible_module_compression' from source: unknown 13131 1726867213.07587: variable 'ansible_shell_type' from source: unknown 13131 1726867213.07590: variable 'ansible_shell_executable' from source: unknown 13131 1726867213.07593: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867213.07595: variable 'ansible_pipelining' from source: unknown 13131 1726867213.07597: variable 'ansible_timeout' from source: unknown 13131 1726867213.07600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867213.07697: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867213.07710: variable 'omit' from source: magic vars 13131 1726867213.07713: starting attempt loop 13131 1726867213.07716: running the handler 13131 1726867213.07807: variable '__network_connections_result' from source: set_fact 13131 1726867213.07848: handler run complete 13131 1726867213.07861: attempt loop complete, returning result 13131 1726867213.07864: _execute() done 13131 1726867213.07866: dumping result to json 13131 1726867213.07869: 
done dumping result, returning 13131 1726867213.07878: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-5f24-9b7a-00000000008d] 13131 1726867213.07882: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000008d 13131 1726867213.07961: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000008d 13131 1726867213.07963: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 13131 1726867213.08030: no more pending results, returning what we have 13131 1726867213.08033: results queue empty 13131 1726867213.08034: checking for any_errors_fatal 13131 1726867213.08038: done checking for any_errors_fatal 13131 1726867213.08039: checking for max_fail_percentage 13131 1726867213.08040: done checking for max_fail_percentage 13131 1726867213.08041: checking to see if all hosts have failed and the running result is not ok 13131 1726867213.08042: done checking to see if all hosts have failed 13131 1726867213.08042: getting the remaining hosts for this loop 13131 1726867213.08043: done getting the remaining hosts for this loop 13131 1726867213.08047: getting the next task for host managed_node1 13131 1726867213.08053: done getting next task for host managed_node1 13131 1726867213.08056: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13131 1726867213.08059: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13131 1726867213.08068: getting variables 13131 1726867213.08069: in VariableManager get_vars() 13131 1726867213.08118: Calling all_inventory to load vars for managed_node1 13131 1726867213.08121: Calling groups_inventory to load vars for managed_node1 13131 1726867213.08123: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867213.08130: Calling all_plugins_play to load vars for managed_node1 13131 1726867213.08132: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867213.08134: Calling groups_plugins_play to load vars for managed_node1 13131 1726867213.08973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867213.13418: done with get_vars() 13131 1726867213.13443: done getting variables 13131 1726867213.13511: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:20:13 -0400 (0:00:00.071) 0:00:28.246 ****** 13131 1726867213.13545: entering _queue_task() for managed_node1/debug 13131 1726867213.13809: worker is 1 (out of 1 available) 13131 1726867213.13822: exiting _queue_task() for managed_node1/debug 13131 1726867213.13834: done queuing things up, now waiting for results queue to drain 13131 1726867213.13835: waiting for pending results... 
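The two debug tasks being queued here (task paths `main.yml:177` and `main.yml:181`) simply print the registered result of the earlier connection-profile task. The variable names below are taken verbatim from the log output (`__network_connections_result.stderr_lines` and `__network_connections_result`); the exact task bodies are an assumed sketch:

```yaml
# Hedged sketch of the role's two debug tasks seen in this log section.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result
```

This is why the log prints `ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] }` first, then the full result structure for the second task.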
13131 1726867213.14026: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13131 1726867213.14124: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000008e 13131 1726867213.14135: variable 'ansible_search_path' from source: unknown 13131 1726867213.14139: variable 'ansible_search_path' from source: unknown 13131 1726867213.14173: calling self._execute() 13131 1726867213.14252: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867213.14256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867213.14265: variable 'omit' from source: magic vars 13131 1726867213.14546: variable 'ansible_distribution_major_version' from source: facts 13131 1726867213.14556: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867213.14562: variable 'omit' from source: magic vars 13131 1726867213.14782: variable 'omit' from source: magic vars 13131 1726867213.14786: variable 'omit' from source: magic vars 13131 1726867213.14789: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867213.14792: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867213.14794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867213.14797: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867213.14800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867213.14823: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867213.14831: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867213.14839: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 13131 1726867213.15083: Set connection var ansible_connection to ssh 13131 1726867213.15099: Set connection var ansible_timeout to 10 13131 1726867213.15109: Set connection var ansible_shell_type to sh 13131 1726867213.15125: Set connection var ansible_shell_executable to /bin/sh 13131 1726867213.15141: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867213.15154: Set connection var ansible_pipelining to False 13131 1726867213.15184: variable 'ansible_shell_executable' from source: unknown 13131 1726867213.15194: variable 'ansible_connection' from source: unknown 13131 1726867213.15203: variable 'ansible_module_compression' from source: unknown 13131 1726867213.15212: variable 'ansible_shell_type' from source: unknown 13131 1726867213.15220: variable 'ansible_shell_executable' from source: unknown 13131 1726867213.15228: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867213.15237: variable 'ansible_pipelining' from source: unknown 13131 1726867213.15246: variable 'ansible_timeout' from source: unknown 13131 1726867213.15256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867213.15417: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867213.15883: variable 'omit' from source: magic vars 13131 1726867213.15887: starting attempt loop 13131 1726867213.15890: running the handler 13131 1726867213.15892: variable '__network_connections_result' from source: set_fact 13131 1726867213.15895: variable '__network_connections_result' from source: set_fact 13131 1726867213.16283: handler run complete 13131 1726867213.16287: attempt loop complete, returning result 13131 1726867213.16290: 
_execute() done 13131 1726867213.16292: dumping result to json 13131 1726867213.16294: done dumping result, returning 13131 1726867213.16297: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-5f24-9b7a-00000000008e] 13131 1726867213.16300: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000008e 13131 1726867213.16374: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000008e 13131 1726867213.16381: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 13131 1726867213.16492: no more pending results, returning what we have 13131 1726867213.16496: results queue empty 13131 1726867213.16497: checking for any_errors_fatal 13131 1726867213.16509: done checking for any_errors_fatal 13131 1726867213.16510: checking for max_fail_percentage 13131 1726867213.16517: done checking for max_fail_percentage 13131 1726867213.16518: checking to see if all hosts have failed and the running result is not ok 13131 1726867213.16519: done checking to see if all hosts have failed 13131 1726867213.16520: getting the remaining hosts for this loop 13131 1726867213.16522: done getting the remaining hosts for this loop 13131 1726867213.16524: getting the next task for host managed_node1 13131 1726867213.16531: done getting next task for host managed_node1 13131 1726867213.16534: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13131 1726867213.16538: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867213.16550: getting variables 13131 1726867213.16552: in VariableManager get_vars() 13131 1726867213.16646: Calling all_inventory to load vars for managed_node1 13131 1726867213.16649: Calling groups_inventory to load vars for managed_node1 13131 1726867213.16651: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867213.16660: Calling all_plugins_play to load vars for managed_node1 13131 1726867213.16662: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867213.16665: Calling groups_plugins_play to load vars for managed_node1 13131 1726867213.18495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867213.20426: done with get_vars() 13131 1726867213.20453: done getting variables 13131 1726867213.20532: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:20:13 -0400 (0:00:00.070) 0:00:28.316 ****** 13131 1726867213.20574: entering _queue_task() for managed_node1/debug 13131 1726867213.20931: 
worker is 1 (out of 1 available) 13131 1726867213.20945: exiting _queue_task() for managed_node1/debug 13131 1726867213.20955: done queuing things up, now waiting for results queue to drain 13131 1726867213.20961: waiting for pending results... 13131 1726867213.21173: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13131 1726867213.21364: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000008f 13131 1726867213.21380: variable 'ansible_search_path' from source: unknown 13131 1726867213.21383: variable 'ansible_search_path' from source: unknown 13131 1726867213.21433: calling self._execute() 13131 1726867213.21542: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867213.21549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867213.21560: variable 'omit' from source: magic vars 13131 1726867213.21980: variable 'ansible_distribution_major_version' from source: facts 13131 1726867213.21991: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867213.22117: variable 'network_state' from source: role '' defaults 13131 1726867213.22134: Evaluated conditional (network_state != {}): False 13131 1726867213.22138: when evaluation is False, skipping this task 13131 1726867213.22141: _execute() done 13131 1726867213.22143: dumping result to json 13131 1726867213.22146: done dumping result, returning 13131 1726867213.22152: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-5f24-9b7a-00000000008f] 13131 1726867213.22156: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000008f skipping: [managed_node1] => { "false_condition": "network_state != {}" } 13131 1726867213.22313: no more pending results, returning what we have 13131 1726867213.22317: results queue empty 13131 1726867213.22318: checking for 
any_errors_fatal 13131 1726867213.22326: done checking for any_errors_fatal 13131 1726867213.22327: checking for max_fail_percentage 13131 1726867213.22330: done checking for max_fail_percentage 13131 1726867213.22331: checking to see if all hosts have failed and the running result is not ok 13131 1726867213.22332: done checking to see if all hosts have failed 13131 1726867213.22333: getting the remaining hosts for this loop 13131 1726867213.22334: done getting the remaining hosts for this loop 13131 1726867213.22337: getting the next task for host managed_node1 13131 1726867213.22342: done getting next task for host managed_node1 13131 1726867213.22346: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13131 1726867213.22349: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867213.22363: getting variables 13131 1726867213.22365: in VariableManager get_vars() 13131 1726867213.22409: Calling all_inventory to load vars for managed_node1 13131 1726867213.22412: Calling groups_inventory to load vars for managed_node1 13131 1726867213.22414: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867213.22422: Calling all_plugins_play to load vars for managed_node1 13131 1726867213.22424: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867213.22426: Calling groups_plugins_play to load vars for managed_node1 13131 1726867213.23312: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000008f 13131 1726867213.23316: WORKER PROCESS EXITING 13131 1726867213.23326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867213.24303: done with get_vars() 13131 1726867213.24324: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:20:13 -0400 (0:00:00.038) 0:00:28.354 ****** 13131 1726867213.24418: entering _queue_task() for managed_node1/ping 13131 1726867213.24696: worker is 1 (out of 1 available) 13131 1726867213.24711: exiting _queue_task() for managed_node1/ping 13131 1726867213.24721: done queuing things up, now waiting for results queue to drain 13131 1726867213.24723: waiting for pending results... 
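The "Re-test connectivity" task (task path `main.yml:192`) is dispatched as `managed_node1/ping` in the log, i.e. the role verifies the managed host is still reachable after the connection changes were applied. A minimal sketch of such a task, with the module name grounded in the `entering _queue_task() for managed_node1/ping` entry and the task body otherwise assumed:

```yaml
# Sketch of the connectivity re-test step; the ping module performs a
# trivial module round-trip over the existing connection plugin (ssh here),
# which is what drives the _low_level_execute_command() calls that follow.
- name: Re-test connectivity
  ping:
```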
13131 1726867213.24996: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 13131 1726867213.25144: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000090 13131 1726867213.25164: variable 'ansible_search_path' from source: unknown 13131 1726867213.25173: variable 'ansible_search_path' from source: unknown 13131 1726867213.25288: calling self._execute() 13131 1726867213.25331: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867213.25336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867213.25346: variable 'omit' from source: magic vars 13131 1726867213.25632: variable 'ansible_distribution_major_version' from source: facts 13131 1726867213.25641: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867213.25647: variable 'omit' from source: magic vars 13131 1726867213.25687: variable 'omit' from source: magic vars 13131 1726867213.25716: variable 'omit' from source: magic vars 13131 1726867213.25748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867213.25774: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867213.25791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867213.25811: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867213.25819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867213.25842: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867213.25846: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867213.25848: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 13131 1726867213.25918: Set connection var ansible_connection to ssh 13131 1726867213.25929: Set connection var ansible_timeout to 10 13131 1726867213.25932: Set connection var ansible_shell_type to sh 13131 1726867213.25938: Set connection var ansible_shell_executable to /bin/sh 13131 1726867213.25946: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867213.25951: Set connection var ansible_pipelining to False 13131 1726867213.25969: variable 'ansible_shell_executable' from source: unknown 13131 1726867213.25972: variable 'ansible_connection' from source: unknown 13131 1726867213.25975: variable 'ansible_module_compression' from source: unknown 13131 1726867213.25979: variable 'ansible_shell_type' from source: unknown 13131 1726867213.25981: variable 'ansible_shell_executable' from source: unknown 13131 1726867213.25984: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867213.25986: variable 'ansible_pipelining' from source: unknown 13131 1726867213.25988: variable 'ansible_timeout' from source: unknown 13131 1726867213.25993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867213.26143: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867213.26153: variable 'omit' from source: magic vars 13131 1726867213.26157: starting attempt loop 13131 1726867213.26159: running the handler 13131 1726867213.26169: _low_level_execute_command(): starting 13131 1726867213.26178: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867213.26664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 
1726867213.26669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867213.26672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867213.26723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867213.26727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867213.26786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867213.28493: stdout chunk (state=3): >>>/root <<< 13131 1726867213.28603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867213.28626: stderr chunk (state=3): >>><<< 13131 1726867213.28630: stdout chunk (state=3): >>><<< 13131 1726867213.28649: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867213.28662: _low_level_execute_command(): starting 13131 1726867213.28668: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867213.2864988-14567-80593613521162 `" && echo ansible-tmp-1726867213.2864988-14567-80593613521162="` echo /root/.ansible/tmp/ansible-tmp-1726867213.2864988-14567-80593613521162 `" ) && sleep 0' 13131 1726867213.29083: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867213.29087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867213.29089: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 13131 1726867213.29099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867213.29143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867213.29147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867213.29197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867213.31056: stdout chunk (state=3): >>>ansible-tmp-1726867213.2864988-14567-80593613521162=/root/.ansible/tmp/ansible-tmp-1726867213.2864988-14567-80593613521162 <<< 13131 1726867213.31162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867213.31184: stderr chunk (state=3): >>><<< 13131 1726867213.31189: stdout chunk (state=3): >>><<< 13131 1726867213.31207: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867213.2864988-14567-80593613521162=/root/.ansible/tmp/ansible-tmp-1726867213.2864988-14567-80593613521162 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867213.31243: variable 'ansible_module_compression' from source: unknown 13131 1726867213.31278: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 13131 1726867213.31315: variable 'ansible_facts' from source: unknown 13131 1726867213.31370: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867213.2864988-14567-80593613521162/AnsiballZ_ping.py 13131 1726867213.31470: Sending initial data 13131 1726867213.31473: Sent initial data (152 bytes) 13131 1726867213.31886: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867213.31889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867213.31892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867213.31894: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867213.31941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867213.31944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867213.31996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867213.33516: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 13131 1726867213.33521: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867213.33560: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867213.33607: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpfsic4n87 /root/.ansible/tmp/ansible-tmp-1726867213.2864988-14567-80593613521162/AnsiballZ_ping.py <<< 13131 1726867213.33611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867213.2864988-14567-80593613521162/AnsiballZ_ping.py" <<< 13131 1726867213.33649: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpfsic4n87" to remote "/root/.ansible/tmp/ansible-tmp-1726867213.2864988-14567-80593613521162/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867213.2864988-14567-80593613521162/AnsiballZ_ping.py" <<< 13131 1726867213.34283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867213.34321: stderr chunk (state=3): >>><<< 13131 1726867213.34324: stdout chunk (state=3): >>><<< 13131 1726867213.34362: done transferring module to remote 13131 1726867213.34371: _low_level_execute_command(): starting 13131 1726867213.34375: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867213.2864988-14567-80593613521162/ /root/.ansible/tmp/ansible-tmp-1726867213.2864988-14567-80593613521162/AnsiballZ_ping.py && sleep 0' 13131 1726867213.34808: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867213.34811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867213.34814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass <<< 13131 1726867213.34816: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867213.34818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867213.34866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867213.34869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867213.34925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867213.36629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867213.36651: stderr chunk (state=3): >>><<< 13131 1726867213.36654: stdout chunk (state=3): >>><<< 13131 1726867213.36666: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867213.36669: _low_level_execute_command(): starting 13131 1726867213.36674: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867213.2864988-14567-80593613521162/AnsiballZ_ping.py && sleep 0' 13131 1726867213.37070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867213.37073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867213.37106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867213.37109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867213.37111: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867213.37113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867213.37162: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867213.37165: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867213.37221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867213.52017: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13131 1726867213.53294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867213.53312: stdout chunk (state=3): >>><<< 13131 1726867213.53325: stderr chunk (state=3): >>><<< 13131 1726867213.53352: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.12.57 closed. 13131 1726867213.53392: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867213.2864988-14567-80593613521162/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867213.53411: _low_level_execute_command(): starting 13131 1726867213.53414: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867213.2864988-14567-80593613521162/ > /dev/null 2>&1 && sleep 0' 13131 1726867213.53830: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867213.53833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867213.53836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867213.53838: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867213.53840: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867213.53883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867213.53900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867213.53963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867213.55795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867213.55821: stderr chunk (state=3): >>><<< 13131 1726867213.55826: stdout chunk (state=3): >>><<< 13131 1726867213.55840: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 13131 1726867213.55845: handler run complete 13131 1726867213.55857: attempt loop complete, returning result 13131 1726867213.55859: _execute() done 13131 1726867213.55862: dumping result to json 13131 1726867213.55865: done dumping result, returning 13131 1726867213.55873: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-5f24-9b7a-000000000090] 13131 1726867213.55879: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000090 13131 1726867213.55968: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000090 13131 1726867213.55970: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 13131 1726867213.56045: no more pending results, returning what we have 13131 1726867213.56049: results queue empty 13131 1726867213.56050: checking for any_errors_fatal 13131 1726867213.56056: done checking for any_errors_fatal 13131 1726867213.56057: checking for max_fail_percentage 13131 1726867213.56059: done checking for max_fail_percentage 13131 1726867213.56059: checking to see if all hosts have failed and the running result is not ok 13131 1726867213.56060: done checking to see if all hosts have failed 13131 1726867213.56060: getting the remaining hosts for this loop 13131 1726867213.56062: done getting the remaining hosts for this loop 13131 1726867213.56065: getting the next task for host managed_node1 13131 1726867213.56074: done getting next task for host managed_node1 13131 1726867213.56076: ^ task is: TASK: meta (role_complete) 13131 1726867213.56081: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867213.56093: getting variables 13131 1726867213.56094: in VariableManager get_vars() 13131 1726867213.56146: Calling all_inventory to load vars for managed_node1 13131 1726867213.56148: Calling groups_inventory to load vars for managed_node1 13131 1726867213.56150: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867213.56159: Calling all_plugins_play to load vars for managed_node1 13131 1726867213.56161: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867213.56164: Calling groups_plugins_play to load vars for managed_node1 13131 1726867213.57313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867213.58423: done with get_vars() 13131 1726867213.58441: done getting variables 13131 1726867213.58506: done queuing things up, now waiting for results queue to drain 13131 1726867213.58508: results queue empty 13131 1726867213.58508: checking for any_errors_fatal 13131 1726867213.58510: done checking for any_errors_fatal 13131 1726867213.58511: checking for max_fail_percentage 13131 1726867213.58512: done checking for max_fail_percentage 13131 1726867213.58512: checking to see if all hosts have failed and the running result is not ok 13131 1726867213.58512: done checking to see if all hosts have failed 13131 1726867213.58513: getting the remaining hosts for this loop 13131 1726867213.58513: done getting the remaining hosts for this loop 13131 1726867213.58515: getting the next task for host managed_node1 13131 1726867213.58518: done getting next task for host 
managed_node1 13131 1726867213.58520: ^ task is: TASK: From the active connection, get the port1 profile "{{ port1_profile }}" 13131 1726867213.58521: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867213.58522: getting variables 13131 1726867213.58523: in VariableManager get_vars() 13131 1726867213.58537: Calling all_inventory to load vars for managed_node1 13131 1726867213.58538: Calling groups_inventory to load vars for managed_node1 13131 1726867213.58540: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867213.58543: Calling all_plugins_play to load vars for managed_node1 13131 1726867213.58544: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867213.58546: Calling groups_plugins_play to load vars for managed_node1 13131 1726867213.59265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867213.60659: done with get_vars() 13131 1726867213.60679: done getting variables 13131 1726867213.60724: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867213.60846: variable 'port1_profile' from source: play vars TASK [From the active connection, get the port1 profile "bond0.0"] ************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:104 Friday 20 September 2024 17:20:13 -0400 (0:00:00.364) 0:00:28.719 ****** 13131 1726867213.60874: entering _queue_task() for 
managed_node1/command 13131 1726867213.61255: worker is 1 (out of 1 available) 13131 1726867213.61268: exiting _queue_task() for managed_node1/command 13131 1726867213.61282: done queuing things up, now waiting for results queue to drain 13131 1726867213.61283: waiting for pending results... 13131 1726867213.61473: running TaskExecutor() for managed_node1/TASK: From the active connection, get the port1 profile "bond0.0" 13131 1726867213.61537: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000c0 13131 1726867213.61550: variable 'ansible_search_path' from source: unknown 13131 1726867213.61581: calling self._execute() 13131 1726867213.61669: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867213.61674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867213.61684: variable 'omit' from source: magic vars 13131 1726867213.61957: variable 'ansible_distribution_major_version' from source: facts 13131 1726867213.61974: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867213.62053: variable 'network_provider' from source: set_fact 13131 1726867213.62059: Evaluated conditional (network_provider == "nm"): True 13131 1726867213.62070: variable 'omit' from source: magic vars 13131 1726867213.62087: variable 'omit' from source: magic vars 13131 1726867213.62154: variable 'port1_profile' from source: play vars 13131 1726867213.62169: variable 'omit' from source: magic vars 13131 1726867213.62204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867213.62233: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867213.62250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867213.62263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 13131 1726867213.62274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867213.62301: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867213.62306: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867213.62310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867213.62374: Set connection var ansible_connection to ssh 13131 1726867213.62383: Set connection var ansible_timeout to 10 13131 1726867213.62387: Set connection var ansible_shell_type to sh 13131 1726867213.62396: Set connection var ansible_shell_executable to /bin/sh 13131 1726867213.62402: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867213.62409: Set connection var ansible_pipelining to False 13131 1726867213.62427: variable 'ansible_shell_executable' from source: unknown 13131 1726867213.62430: variable 'ansible_connection' from source: unknown 13131 1726867213.62432: variable 'ansible_module_compression' from source: unknown 13131 1726867213.62434: variable 'ansible_shell_type' from source: unknown 13131 1726867213.62437: variable 'ansible_shell_executable' from source: unknown 13131 1726867213.62439: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867213.62444: variable 'ansible_pipelining' from source: unknown 13131 1726867213.62447: variable 'ansible_timeout' from source: unknown 13131 1726867213.62450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867213.62552: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867213.62561: 
variable 'omit' from source: magic vars 13131 1726867213.62566: starting attempt loop 13131 1726867213.62569: running the handler 13131 1726867213.62583: _low_level_execute_command(): starting 13131 1726867213.62590: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867213.63300: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867213.63308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867213.63310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867213.63313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867213.63351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867213.65052: stdout chunk (state=3): >>>/root <<< 13131 1726867213.65157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867213.65181: stderr chunk (state=3): >>><<< 13131 1726867213.65185: stdout chunk (state=3): >>><<< 13131 1726867213.65205: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867213.65220: _low_level_execute_command(): starting 13131 1726867213.65225: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867213.6520689-14591-209847341982179 `" && echo ansible-tmp-1726867213.6520689-14591-209847341982179="` echo /root/.ansible/tmp/ansible-tmp-1726867213.6520689-14591-209847341982179 `" ) && sleep 0' 13131 1726867213.65674: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 
originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867213.65693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867213.65727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867213.67608: stdout chunk (state=3): >>>ansible-tmp-1726867213.6520689-14591-209847341982179=/root/.ansible/tmp/ansible-tmp-1726867213.6520689-14591-209847341982179 <<< 13131 1726867213.67711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867213.67734: stderr chunk (state=3): >>><<< 13131 1726867213.67737: stdout chunk (state=3): >>><<< 13131 1726867213.67751: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867213.6520689-14591-209847341982179=/root/.ansible/tmp/ansible-tmp-1726867213.6520689-14591-209847341982179 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867213.67775: variable 'ansible_module_compression' from source: unknown 13131 1726867213.67818: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13131 1726867213.67855: variable 'ansible_facts' from source: unknown 13131 1726867213.67913: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867213.6520689-14591-209847341982179/AnsiballZ_command.py 13131 1726867213.68014: Sending initial data 13131 1726867213.68018: Sent initial data (156 bytes) 13131 1726867213.68438: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867213.68441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867213.68443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867213.68447: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 
is address debug1: re-parsing configuration <<< 13131 1726867213.68449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867213.68451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867213.68501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867213.68509: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867213.68575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867213.70115: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13131 1726867213.70118: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867213.70152: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867213.70198: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpp3lc3xtv /root/.ansible/tmp/ansible-tmp-1726867213.6520689-14591-209847341982179/AnsiballZ_command.py <<< 13131 1726867213.70201: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867213.6520689-14591-209847341982179/AnsiballZ_command.py" <<< 13131 1726867213.70242: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpp3lc3xtv" to remote "/root/.ansible/tmp/ansible-tmp-1726867213.6520689-14591-209847341982179/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867213.6520689-14591-209847341982179/AnsiballZ_command.py" <<< 13131 1726867213.70792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867213.70827: stderr chunk (state=3): >>><<< 13131 1726867213.70831: stdout chunk (state=3): >>><<< 13131 1726867213.70854: done transferring module to remote 13131 1726867213.70862: _low_level_execute_command(): starting 13131 1726867213.70875: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867213.6520689-14591-209847341982179/ /root/.ansible/tmp/ansible-tmp-1726867213.6520689-14591-209847341982179/AnsiballZ_command.py && sleep 0' 13131 1726867213.71280: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867213.71284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867213.71286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 
1726867213.71288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867213.71290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867213.71340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867213.71343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867213.71394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867213.73134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867213.73153: stderr chunk (state=3): >>><<< 13131 1726867213.73156: stdout chunk (state=3): >>><<< 13131 1726867213.73170: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867213.73173: _low_level_execute_command(): starting 13131 1726867213.73175: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867213.6520689-14591-209847341982179/AnsiballZ_command.py && sleep 0' 13131 1726867213.73594: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867213.73597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867213.73599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 13131 1726867213.73602: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867213.73604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867213.73651: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867213.73655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867213.73709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867213.90540: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.0"], "start": "2024-09-20 17:20:13.886641", "end": "2024-09-20 17:20:13.903632", "delta": "0:00:00.016991", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13131 1726867213.92188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867213.92191: stdout chunk (state=3): >>><<< 13131 1726867213.92193: stderr chunk (state=3): >>><<< 13131 1726867213.92196: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.0"], "start": "2024-09-20 17:20:13.886641", "end": "2024-09-20 17:20:13.903632", "delta": "0:00:00.016991", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867213.92198: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867213.6520689-14591-209847341982179/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867213.92201: _low_level_execute_command(): starting 13131 1726867213.92203: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867213.6520689-14591-209847341982179/ > /dev/null 2>&1 && sleep 0' 13131 1726867213.92843: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867213.92854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867213.92926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867213.94718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867213.94739: stderr chunk (state=3): >>><<< 13131 1726867213.94742: stdout chunk (state=3): >>><<< 13131 1726867213.94755: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867213.94760: handler run complete 13131 1726867213.94785: Evaluated conditional (False): False 13131 1726867213.94793: attempt loop complete, returning 
result 13131 1726867213.94796: _execute() done 13131 1726867213.94798: dumping result to json 13131 1726867213.94886: done dumping result, returning 13131 1726867213.94889: done running TaskExecutor() for managed_node1/TASK: From the active connection, get the port1 profile "bond0.0" [0affcac9-a3a5-5f24-9b7a-0000000000c0] 13131 1726867213.94891: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000c0 13131 1726867213.94965: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000c0 13131 1726867213.94968: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0.0" ], "delta": "0:00:00.016991", "end": "2024-09-20 17:20:13.903632", "rc": 0, "start": "2024-09-20 17:20:13.886641" } 13131 1726867213.95063: no more pending results, returning what we have 13131 1726867213.95066: results queue empty 13131 1726867213.95067: checking for any_errors_fatal 13131 1726867213.95068: done checking for any_errors_fatal 13131 1726867213.95069: checking for max_fail_percentage 13131 1726867213.95070: done checking for max_fail_percentage 13131 1726867213.95071: checking to see if all hosts have failed and the running result is not ok 13131 1726867213.95072: done checking to see if all hosts have failed 13131 1726867213.95072: getting the remaining hosts for this loop 13131 1726867213.95074: done getting the remaining hosts for this loop 13131 1726867213.95079: getting the next task for host managed_node1 13131 1726867213.95084: done getting next task for host managed_node1 13131 1726867213.95086: ^ task is: TASK: From the active connection, get the port2 profile "{{ port2_profile }}" 13131 1726867213.95088: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867213.95092: getting variables 13131 1726867213.95093: in VariableManager get_vars() 13131 1726867213.95141: Calling all_inventory to load vars for managed_node1 13131 1726867213.95143: Calling groups_inventory to load vars for managed_node1 13131 1726867213.95145: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867213.95154: Calling all_plugins_play to load vars for managed_node1 13131 1726867213.95156: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867213.95159: Calling groups_plugins_play to load vars for managed_node1 13131 1726867213.96413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867213.97379: done with get_vars() 13131 1726867213.97394: done getting variables 13131 1726867213.97436: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867213.97521: variable 'port2_profile' from source: play vars TASK [From the active connection, get the port2 profile "bond0.1"] ************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:111 Friday 20 September 2024 17:20:13 -0400 (0:00:00.366) 0:00:29.086 ****** 13131 1726867213.97542: entering _queue_task() for managed_node1/command 13131 1726867213.97768: worker is 1 (out of 1 available) 13131 1726867213.97784: exiting _queue_task() for managed_node1/command 13131 1726867213.97797: done queuing things up, now waiting for results queue to drain 13131 1726867213.97799: waiting for pending results... 
13131 1726867213.97974: running TaskExecutor() for managed_node1/TASK: From the active connection, get the port2 profile "bond0.1" 13131 1726867213.98044: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000c1 13131 1726867213.98057: variable 'ansible_search_path' from source: unknown 13131 1726867213.98087: calling self._execute() 13131 1726867213.98168: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867213.98172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867213.98184: variable 'omit' from source: magic vars 13131 1726867213.98462: variable 'ansible_distribution_major_version' from source: facts 13131 1726867213.98481: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867213.98557: variable 'network_provider' from source: set_fact 13131 1726867213.98561: Evaluated conditional (network_provider == "nm"): True 13131 1726867213.98569: variable 'omit' from source: magic vars 13131 1726867213.98590: variable 'omit' from source: magic vars 13131 1726867213.98654: variable 'port2_profile' from source: play vars 13131 1726867213.98668: variable 'omit' from source: magic vars 13131 1726867213.98707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867213.98732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867213.98747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867213.98760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867213.98770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867213.98799: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867213.98805: 
variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867213.98807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867213.98873: Set connection var ansible_connection to ssh 13131 1726867213.98880: Set connection var ansible_timeout to 10 13131 1726867213.98883: Set connection var ansible_shell_type to sh 13131 1726867213.98892: Set connection var ansible_shell_executable to /bin/sh 13131 1726867213.98901: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867213.98911: Set connection var ansible_pipelining to False 13131 1726867213.98928: variable 'ansible_shell_executable' from source: unknown 13131 1726867213.98931: variable 'ansible_connection' from source: unknown 13131 1726867213.98934: variable 'ansible_module_compression' from source: unknown 13131 1726867213.98936: variable 'ansible_shell_type' from source: unknown 13131 1726867213.98938: variable 'ansible_shell_executable' from source: unknown 13131 1726867213.98940: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867213.98943: variable 'ansible_pipelining' from source: unknown 13131 1726867213.98945: variable 'ansible_timeout' from source: unknown 13131 1726867213.98950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867213.99055: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867213.99063: variable 'omit' from source: magic vars 13131 1726867213.99069: starting attempt loop 13131 1726867213.99071: running the handler 13131 1726867213.99087: _low_level_execute_command(): starting 13131 1726867213.99094: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 
1726867213.99558: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867213.99583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867213.99586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13131 1726867213.99590: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867213.99594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867213.99657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867213.99661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867213.99664: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867213.99730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867214.01381: stdout chunk (state=3): >>>/root <<< 13131 1726867214.01482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867214.01508: stderr chunk (state=3): >>><<< 13131 1726867214.01512: stdout chunk (state=3): >>><<< 13131 1726867214.01531: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13131 1726867214.01544: _low_level_execute_command(): starting
13131 1726867214.01549: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867214.015325-14607-107978354995153 `" && echo ansible-tmp-1726867214.015325-14607-107978354995153="` echo /root/.ansible/tmp/ansible-tmp-1726867214.015325-14607-107978354995153 `" ) && sleep 0'
13131 1726867214.01945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13131 1726867214.01950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13131 1726867214.01982: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
13131 1726867214.01986: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<<
13131 1726867214.01989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13131 1726867214.02043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<<
13131 1726867214.02047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13131 1726867214.02053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13131 1726867214.02103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13131 1726867214.04017: stdout chunk (state=3): >>>ansible-tmp-1726867214.015325-14607-107978354995153=/root/.ansible/tmp/ansible-tmp-1726867214.015325-14607-107978354995153 <<<
13131 1726867214.04174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13131 1726867214.04180: stdout chunk (state=3): >>><<<
13131 1726867214.04183: stderr chunk (state=3): >>><<<
13131 1726867214.04199: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867214.015325-14607-107978354995153=/root/.ansible/tmp/ansible-tmp-1726867214.015325-14607-107978354995153 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13131 1726867214.04283: variable 'ansible_module_compression' from source: unknown
13131 1726867214.04298: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
13131 1726867214.04340: variable 'ansible_facts' from source: unknown
13131 1726867214.04403: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867214.015325-14607-107978354995153/AnsiballZ_command.py
13131 1726867214.04497: Sending initial data
13131 1726867214.04500: Sent initial data (155 bytes)
13131 1726867214.04917: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13131 1726867214.04921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13131 1726867214.04923: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<<
13131 1726867214.04925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13131 1726867214.04927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<<
13131 1726867214.04929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13131 1726867214.04974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<<
13131 1726867214.04982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
13131 1726867214.05024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13131 1726867214.06566: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
13131 1726867214.06616:
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
13131 1726867214.06664: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp_u8rexrk /root/.ansible/tmp/ansible-tmp-1726867214.015325-14607-107978354995153/AnsiballZ_command.py <<<
13131 1726867214.06680: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867214.015325-14607-107978354995153/AnsiballZ_command.py" <<<
13131 1726867214.06716: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<<
13131 1726867214.06738: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp_u8rexrk" to remote "/root/.ansible/tmp/ansible-tmp-1726867214.015325-14607-107978354995153/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867214.015325-14607-107978354995153/AnsiballZ_command.py" <<<
13131 1726867214.07690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13131 1726867214.07829: stderr chunk (state=3): >>><<<
13131 1726867214.07832: stdout chunk (state=3): >>><<<
13131 1726867214.07835: done transferring module to remote
13131 1726867214.07837: _low_level_execute_command(): starting
13131 1726867214.07839: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867214.015325-14607-107978354995153/ /root/.ansible/tmp/ansible-tmp-1726867214.015325-14607-107978354995153/AnsiballZ_command.py && sleep 0'
13131 1726867214.08367: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13131 1726867214.08383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13131 1726867214.08398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13131 1726867214.08514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<<
13131 1726867214.08546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13131 1726867214.08559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13131 1726867214.08636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13131 1726867214.10594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13131 1726867214.10597: stdout chunk (state=3): >>><<<
13131 1726867214.10600: stderr chunk (state=3): >>><<<
13131 1726867214.10604: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13131 1726867214.10606: _low_level_execute_command(): starting
13131 1726867214.10608: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867214.015325-14607-107978354995153/AnsiballZ_command.py && sleep 0'
13131 1726867214.11259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13131 1726867214.11304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
13131 1726867214.11319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13131 1726867214.28415: stdout chunk (state=3): >>>
{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.1"], "start": "2024-09-20 17:20:14.263614", "end": "2024-09-20 17:20:14.280398", "delta": "0:00:00.016784", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<<
13131 1726867214.29885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<<
13131 1726867214.29889: stdout chunk (state=3): >>><<<
13131 1726867214.29892: stderr chunk (state=3): >>><<<
13131 1726867214.29894: _low_level_execute_command() done: rc=0, stdout=
{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.1"], "start": "2024-09-20 17:20:14.263614", "end": "2024-09-20 17:20:14.280398", "delta": "0:00:00.016784", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed.
13131 1726867214.29914: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867214.015325-14607-107978354995153/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
13131 1726867214.29957: _low_level_execute_command(): starting
13131 1726867214.29967: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867214.015325-14607-107978354995153/ > /dev/null 2>&1 && sleep 0'
13131 1726867214.31043: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
13131 1726867214.31046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<<
13131 1726867214.31049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13131 1726867214.31051: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
13131 1726867214.31152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
13131 1726867214.31190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13131 1726867214.33176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13131 1726867214.33385: stdout chunk (state=3): >>><<<
13131 1726867214.33389: stderr chunk (state=3): >>><<<
13131 1726867214.33391: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13131 1726867214.33394: handler run complete
13131 1726867214.33396: Evaluated conditional (False): False
13131 1726867214.33398: attempt loop complete, returning result
13131 1726867214.33403: _execute() done
13131 1726867214.33405: dumping result to json
13131 1726867214.33407: done dumping result, returning
13131 1726867214.33409: done running TaskExecutor() for managed_node1/TASK: From the active connection, get the port2 profile "bond0.1" [0affcac9-a3a5-5f24-9b7a-0000000000c1]
13131 1726867214.33411: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000c1
13131 1726867214.33692: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000c1
13131 1726867214.33695: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": [
        "nmcli",
        "c",
        "show",
        "--active",
        "bond0.1"
    ],
    "delta": "0:00:00.016784",
    "end": "2024-09-20 17:20:14.280398",
    "rc": 0,
    "start": "2024-09-20 17:20:14.263614"
}
13131 1726867214.33776: no more pending results, returning what we have
13131 1726867214.33782: results queue empty
13131 1726867214.33783: checking for any_errors_fatal
13131 1726867214.33794: done checking for any_errors_fatal
13131 1726867214.33795: checking for max_fail_percentage
13131 1726867214.33797: done checking for max_fail_percentage
13131 1726867214.33798: checking to see if all hosts have failed and the running result is not ok
13131 1726867214.33799: done checking to see if all hosts have failed
13131 1726867214.33800: getting the remaining hosts for this loop
13131 1726867214.33804: done getting the remaining hosts for this loop
13131 1726867214.33808: getting the next task for host managed_node1
13131 1726867214.33815: done getting next task for host managed_node1
13131 1726867214.33817: ^ task is: TASK: Assert that the port1 profile is not activated
13131 1726867214.33820: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867214.33823: getting variables
13131 1726867214.33825: in VariableManager get_vars()
13131 1726867214.33987: Calling all_inventory to load vars for managed_node1
13131 1726867214.33991: Calling groups_inventory to load vars for managed_node1
13131 1726867214.33994: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867214.34008: Calling all_plugins_play to load vars for managed_node1
13131 1726867214.34012: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867214.34015: Calling groups_plugins_play to load vars for managed_node1
13131 1726867214.36881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867214.39043: done with get_vars()
13131 1726867214.39071: done getting variables
13131 1726867214.39136: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Assert that the port1 profile is not activated]
**************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:118
Friday 20 September 2024 17:20:14 -0400 (0:00:00.416) 0:00:29.502 ******
13131 1726867214.39171: entering _queue_task() for managed_node1/assert
13131 1726867214.39532: worker is 1 (out of 1 available)
13131 1726867214.39545: exiting _queue_task() for managed_node1/assert
13131 1726867214.39556: done queuing things up, now waiting for results queue to drain
13131 1726867214.39557: waiting for pending results...
13131 1726867214.39928: running TaskExecutor() for managed_node1/TASK: Assert that the port1 profile is not activated
13131 1726867214.40112: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000c2
13131 1726867214.40116: variable 'ansible_search_path' from source: unknown
13131 1726867214.40120: calling self._execute()
13131 1726867214.40599: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867214.40692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867214.40697: variable 'omit' from source: magic vars
13131 1726867214.41154: variable 'ansible_distribution_major_version' from source: facts
13131 1726867214.41206: Evaluated conditional (ansible_distribution_major_version != '6'): True
13131 1726867214.41345: variable 'network_provider' from source: set_fact
13131 1726867214.41354: Evaluated conditional (network_provider == "nm"): True
13131 1726867214.41359: variable 'omit' from source: magic vars
13131 1726867214.41379: variable 'omit' from source: magic vars
13131 1726867214.41472: variable 'port1_profile' from source: play vars
13131 1726867214.41495: variable 'omit' from source: magic vars
13131 1726867214.41534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13131 1726867214.41569: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13131 1726867214.41593: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13131 1726867214.41611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13131 1726867214.41623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13131 1726867214.41653: variable 'inventory_hostname' from source: host vars for 'managed_node1'
13131 1726867214.41656: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867214.41658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867214.41756: Set connection var ansible_connection to ssh
13131 1726867214.41764: Set connection var ansible_timeout to 10
13131 1726867214.41767: Set connection var ansible_shell_type to sh
13131 1726867214.41779: Set connection var ansible_shell_executable to /bin/sh
13131 1726867214.41786: Set connection var ansible_module_compression to ZIP_DEFLATED
13131 1726867214.41791: Set connection var ansible_pipelining to False
13131 1726867214.41818: variable 'ansible_shell_executable' from source: unknown
13131 1726867214.41822: variable 'ansible_connection' from source: unknown
13131 1726867214.41824: variable 'ansible_module_compression' from source: unknown
13131 1726867214.41827: variable 'ansible_shell_type' from source: unknown
13131 1726867214.41829: variable 'ansible_shell_executable' from source: unknown
13131 1726867214.41831: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867214.41833: variable 'ansible_pipelining' from source: unknown
13131 1726867214.41884: variable 'ansible_timeout' from source: unknown
13131 1726867214.41888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867214.41974: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
13131 1726867214.41988: variable 'omit' from source: magic vars
13131 1726867214.41994: starting attempt loop
13131 1726867214.41997: running the handler
13131 1726867214.42160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13131 1726867214.45396: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13131 1726867214.45538: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13131 1726867214.45542: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13131 1726867214.45550: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13131 1726867214.45579: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13131 1726867214.45648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13131 1726867214.45681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13131 1726867214.45707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13131 1726867214.45748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13131 1726867214.45765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13131 1726867214.45875: variable 'active_port1_profile' from source: set_fact
13131 1726867214.45887: Evaluated conditional (active_port1_profile.stdout | length == 0): True
13131 1726867214.45893: handler run complete
13131 1726867214.45907: attempt loop complete, returning result
13131 1726867214.45910: _execute() done
13131 1726867214.45913: dumping result to json
13131 1726867214.45916: done dumping result, returning
13131 1726867214.45925: done running TaskExecutor() for managed_node1/TASK: Assert that the port1 profile is not activated [0affcac9-a3a5-5f24-9b7a-0000000000c2]
13131 1726867214.45928: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000c2
ok: [managed_node1] => {
    "changed": false
}

MSG:

All assertions passed
13131 1726867214.46174: no more pending results, returning what we have
13131 1726867214.46179: results queue empty
13131 1726867214.46180: checking for any_errors_fatal
13131 1726867214.46186: done checking for any_errors_fatal
13131 1726867214.46187: checking for max_fail_percentage
13131 1726867214.46189: done checking for max_fail_percentage
13131 1726867214.46189: checking to see if all hosts have failed and the running result is not ok
13131 1726867214.46190: done checking to see if all hosts have failed
13131 1726867214.46191: getting the remaining hosts for this loop
13131 1726867214.46192: done getting the remaining hosts for this loop
13131 1726867214.46195: getting the next task for host managed_node1
13131 1726867214.46202: done getting next task for host managed_node1
13131 1726867214.46204: ^ task is: TASK: Assert that the port2 profile is not activated
13131 1726867214.46206: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867214.46209: getting variables
13131 1726867214.46210: in VariableManager get_vars()
13131 1726867214.46253: Calling all_inventory to load vars for managed_node1
13131 1726867214.46260: Calling groups_inventory to load vars for managed_node1
13131 1726867214.46262: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867214.46271: Calling all_plugins_play to load vars for managed_node1
13131 1726867214.46273: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867214.46276: Calling groups_plugins_play to load vars for managed_node1
13131 1726867214.46861: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000c2
13131 1726867214.46897: WORKER PROCESS EXITING
13131 1726867214.48754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867214.50409: done with get_vars()
13131 1726867214.50430: done getting variables
13131 1726867214.50490: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Assert that the port2 profile is not activated] **************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:125
Friday 20 September 2024 17:20:14 -0400 (0:00:00.113) 0:00:29.615 ******
13131 1726867214.50523: entering _queue_task() for managed_node1/assert
13131 1726867214.50963: worker is 1 (out of 1 available)
13131 1726867214.50974: exiting _queue_task() for managed_node1/assert
13131 1726867214.50986: done queuing things up, now waiting for results queue to drain
13131 1726867214.50987: waiting for pending results...
13131 1726867214.51282: running TaskExecutor() for managed_node1/TASK: Assert that the port2 profile is not activated
13131 1726867214.51289: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000c3
13131 1726867214.51307: variable 'ansible_search_path' from source: unknown
13131 1726867214.51352: calling self._execute()
13131 1726867214.51471: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867214.51485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867214.51507: variable 'omit' from source: magic vars
13131 1726867214.52496: variable 'ansible_distribution_major_version' from source: facts
13131 1726867214.52500: Evaluated conditional (ansible_distribution_major_version != '6'): True
13131 1726867214.52503: variable 'network_provider' from source: set_fact
13131 1726867214.52505: Evaluated conditional (network_provider == "nm"): True
13131 1726867214.52508: variable 'omit' from source: magic vars
13131 1726867214.52510: variable 'omit' from source: magic vars
13131 1726867214.52663: variable 'port2_profile' from source: play vars
13131 1726867214.52721: variable 'omit' from source: magic vars
13131 1726867214.52831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13131 1726867214.52948: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13131 1726867214.52968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13131 1726867214.52991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13131 1726867214.53009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13131 1726867214.53054: variable 'inventory_hostname' from source: host vars for 'managed_node1'
13131 1726867214.53108: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867214.53116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867214.53282: Set connection var ansible_connection to ssh
13131 1726867214.53398: Set connection var ansible_timeout to 10
13131 1726867214.53422: Set connection var ansible_shell_type to sh
13131 1726867214.53479: Set connection var ansible_shell_executable to /bin/sh
13131 1726867214.53495: Set connection var ansible_module_compression to ZIP_DEFLATED
13131 1726867214.53504: Set connection var ansible_pipelining to False
13131 1726867214.53529: variable 'ansible_shell_executable' from source: unknown
13131 1726867214.53537: variable 'ansible_connection' from source: unknown
13131 1726867214.53544: variable 'ansible_module_compression' from source: unknown
13131 1726867214.53550: variable 'ansible_shell_type' from source: unknown
13131 1726867214.53556: variable 'ansible_shell_executable' from source: unknown
13131 1726867214.53562: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867214.53579: variable 'ansible_pipelining' from source: unknown
13131 1726867214.53589: variable 'ansible_timeout' from source: unknown
13131 1726867214.53598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867214.53789: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
13131 1726867214.53793: variable 'omit' from source: magic vars
13131 1726867214.53795: starting attempt loop
13131 1726867214.53798: running the handler
13131 1726867214.53944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13131 1726867214.56131: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13131 1726867214.56217: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13131 1726867214.56260: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13131 1726867214.56305: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13131 1726867214.56337: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13131 1726867214.56420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13131 1726867214.56459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13131 1726867214.56491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13131 1726867214.56536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13131 1726867214.56555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13131 1726867214.56653: variable 'active_port2_profile' from source: set_fact
13131 1726867214.56676: Evaluated conditional (active_port2_profile.stdout | length == 0): True
13131 1726867214.56689: handler run complete
13131 1726867214.56706: attempt loop complete, returning result
13131 1726867214.56827: _execute() done
13131 1726867214.56829: dumping result to json
13131 1726867214.56831: done dumping result, returning
13131 1726867214.56833: done running TaskExecutor() for managed_node1/TASK: Assert that the port2 profile is not activated [0affcac9-a3a5-5f24-9b7a-0000000000c3]
13131 1726867214.56836: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000c3
13131 1726867214.56904: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000c3
13131 1726867214.56907: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false
}

MSG:

All assertions passed
13131 1726867214.56980: no more pending results, returning what we have
13131 1726867214.56984: results queue empty
13131 1726867214.56985: checking for any_errors_fatal
13131 1726867214.56993: done checking for any_errors_fatal
13131 1726867214.56993: checking for max_fail_percentage
13131 1726867214.56995: done checking for max_fail_percentage
13131 1726867214.56996: checking to see if all hosts have failed and the running result is not ok
13131 1726867214.56997: done checking to see if all hosts have failed
13131 1726867214.56998: getting the remaining hosts for this loop
13131 1726867214.56999: done getting the remaining hosts for this loop
13131 1726867214.57003: getting the next task for host managed_node1
13131 1726867214.57009: done getting next task for host managed_node1
13131 1726867214.57011: ^ task is: TASK: Get the port1 device state
13131 1726867214.57013: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867214.57016: getting variables 13131 1726867214.57024: in VariableManager get_vars() 13131 1726867214.57081: Calling all_inventory to load vars for managed_node1 13131 1726867214.57084: Calling groups_inventory to load vars for managed_node1 13131 1726867214.57086: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867214.57098: Calling all_plugins_play to load vars for managed_node1 13131 1726867214.57100: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867214.57103: Calling groups_plugins_play to load vars for managed_node1 13131 1726867214.58769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867214.60339: done with get_vars() 13131 1726867214.60363: done getting variables 13131 1726867214.60426: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the port1 device state] ********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:132 Friday 20 September 2024 17:20:14 -0400 (0:00:00.099) 0:00:29.715 ****** 13131 1726867214.60456: entering _queue_task() for managed_node1/command 13131 1726867214.60811: worker is 1 (out of 1 available) 13131 1726867214.60822: exiting _queue_task() for managed_node1/command 13131 1726867214.60834: done queuing things up, now waiting for results queue to drain 13131 1726867214.60835: waiting for pending results... 
13131 1726867214.61123: running TaskExecutor() for managed_node1/TASK: Get the port1 device state
13131 1726867214.61284: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000c4
13131 1726867214.61288: variable 'ansible_search_path' from source: unknown
13131 1726867214.61308: calling self._execute()
13131 1726867214.61425: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867214.61437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867214.61451: variable 'omit' from source: magic vars
13131 1726867214.61852: variable 'ansible_distribution_major_version' from source: facts
13131 1726867214.61870: Evaluated conditional (ansible_distribution_major_version != '6'): True
13131 1726867214.61991: variable 'network_provider' from source: set_fact
13131 1726867214.62061: Evaluated conditional (network_provider == "initscripts"): False
13131 1726867214.62065: when evaluation is False, skipping this task
13131 1726867214.62068: _execute() done
13131 1726867214.62071: dumping result to json
13131 1726867214.62073: done dumping result, returning
13131 1726867214.62075: done running TaskExecutor() for managed_node1/TASK: Get the port1 device state [0affcac9-a3a5-5f24-9b7a-0000000000c4]
13131 1726867214.62079: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000c4
13131 1726867214.62146: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000c4
13131 1726867214.62149: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
13131 1726867214.62220: no more pending results, returning what we have
13131 1726867214.62225: results queue empty
13131 1726867214.62226: checking for any_errors_fatal
13131 1726867214.62235: done checking for any_errors_fatal
13131 1726867214.62236: checking for max_fail_percentage
13131 1726867214.62238: done checking for max_fail_percentage
13131 1726867214.62239: checking to see if all hosts have failed and the running result is not ok
13131 1726867214.62240: done checking to see if all hosts have failed
13131 1726867214.62240: getting the remaining hosts for this loop
13131 1726867214.62242: done getting the remaining hosts for this loop
13131 1726867214.62245: getting the next task for host managed_node1
13131 1726867214.62253: done getting next task for host managed_node1
13131 1726867214.62255: ^ task is: TASK: Get the port2 device state
13131 1726867214.62259: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867214.62263: getting variables
13131 1726867214.62265: in VariableManager get_vars()
13131 1726867214.62326: Calling all_inventory to load vars for managed_node1
13131 1726867214.62330: Calling groups_inventory to load vars for managed_node1
13131 1726867214.62333: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867214.62345: Calling all_plugins_play to load vars for managed_node1
13131 1726867214.62348: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867214.62351: Calling groups_plugins_play to load vars for managed_node1
13131 1726867214.64079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867214.65533: done with get_vars()
13131 1726867214.65554: done getting variables
13131 1726867214.65612: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Get the port2 device state] **********************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:139
Friday 20 September 2024 17:20:14 -0400 (0:00:00.051) 0:00:29.766 ******
13131 1726867214.65636: entering _queue_task() for managed_node1/command
13131 1726867214.65930: worker is 1 (out of 1 available)
13131 1726867214.65944: exiting _queue_task() for managed_node1/command
13131 1726867214.65955: done queuing things up, now waiting for results queue to drain
13131 1726867214.65956: waiting for pending results...
13131 1726867214.66306: running TaskExecutor() for managed_node1/TASK: Get the port2 device state
13131 1726867214.66328: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000c5
13131 1726867214.66348: variable 'ansible_search_path' from source: unknown
13131 1726867214.66389: calling self._execute()
13131 1726867214.66495: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867214.66582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867214.66585: variable 'omit' from source: magic vars
13131 1726867214.66893: variable 'ansible_distribution_major_version' from source: facts
13131 1726867214.66909: Evaluated conditional (ansible_distribution_major_version != '6'): True
13131 1726867214.67027: variable 'network_provider' from source: set_fact
13131 1726867214.67038: Evaluated conditional (network_provider == "initscripts"): False
13131 1726867214.67045: when evaluation is False, skipping this task
13131 1726867214.67055: _execute() done
13131 1726867214.67061: dumping result to json
13131 1726867214.67068: done dumping result, returning
13131 1726867214.67079: done running TaskExecutor() for managed_node1/TASK: Get the port2 device state [0affcac9-a3a5-5f24-9b7a-0000000000c5]
13131 1726867214.67087: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000c5
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
13131 1726867214.67230: no more pending results, returning what we have
13131 1726867214.67234: results queue empty
13131 1726867214.67235: checking for any_errors_fatal
13131 1726867214.67242: done checking for any_errors_fatal
13131 1726867214.67243: checking for max_fail_percentage
13131 1726867214.67245: done checking for max_fail_percentage
13131 1726867214.67246: checking to see if all hosts have failed and the running result is not ok
13131 1726867214.67247: done checking to see if all hosts have failed
13131 1726867214.67247: getting the remaining hosts for this loop
13131 1726867214.67249: done getting the remaining hosts for this loop
13131 1726867214.67252: getting the next task for host managed_node1
13131 1726867214.67259: done getting next task for host managed_node1
13131 1726867214.67261: ^ task is: TASK: Assert that the port1 device is in DOWN state
13131 1726867214.67264: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867214.67268: getting variables
13131 1726867214.67269: in VariableManager get_vars()
13131 1726867214.67328: Calling all_inventory to load vars for managed_node1
13131 1726867214.67331: Calling groups_inventory to load vars for managed_node1
13131 1726867214.67333: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867214.67345: Calling all_plugins_play to load vars for managed_node1
13131 1726867214.67348: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867214.67351: Calling groups_plugins_play to load vars for managed_node1
13131 1726867214.68146: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000c5
13131 1726867214.68150: WORKER PROCESS EXITING
13131 1726867214.68849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867214.70384: done with get_vars()
13131 1726867214.70404: done getting variables
13131 1726867214.70462: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Assert that the port1 device is in DOWN state] ***************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:146
Friday 20 September 2024 17:20:14 -0400 (0:00:00.048) 0:00:29.815 ******
13131 1726867214.70490: entering _queue_task() for managed_node1/assert
13131 1726867214.71207: worker is 1 (out of 1 available)
13131 1726867214.71217: exiting _queue_task() for managed_node1/assert
13131 1726867214.71227: done queuing things up, now waiting for results queue to drain
13131 1726867214.71228: waiting for pending results...
13131 1726867214.71336: running TaskExecutor() for managed_node1/TASK: Assert that the port1 device is in DOWN state
13131 1726867214.71484: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000c6
13131 1726867214.71488: variable 'ansible_search_path' from source: unknown
13131 1726867214.71501: calling self._execute()
13131 1726867214.71607: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867214.71619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867214.71633: variable 'omit' from source: magic vars
13131 1726867214.72009: variable 'ansible_distribution_major_version' from source: facts
13131 1726867214.72027: Evaluated conditional (ansible_distribution_major_version != '6'): True
13131 1726867214.72189: variable 'network_provider' from source: set_fact
13131 1726867214.72201: Evaluated conditional (network_provider == "initscripts"): False
13131 1726867214.72210: when evaluation is False, skipping this task
13131 1726867214.72217: _execute() done
13131 1726867214.72224: dumping result to json
13131 1726867214.72231: done dumping result, returning
13131 1726867214.72241: done running TaskExecutor() for managed_node1/TASK: Assert that the port1 device is in DOWN state [0affcac9-a3a5-5f24-9b7a-0000000000c6]
13131 1726867214.72256: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000c6
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
13131 1726867214.72638: no more pending results, returning what we have
13131 1726867214.72641: results queue empty
13131 1726867214.72642: checking for any_errors_fatal
13131 1726867214.72648: done checking for any_errors_fatal
13131 1726867214.72648: checking for max_fail_percentage
13131 1726867214.72650: done checking for max_fail_percentage
13131 1726867214.72650: checking to see if all hosts have failed and the running result is not ok
13131 1726867214.72651: done checking to see if all hosts have failed
13131 1726867214.72652: getting the remaining hosts for this loop
13131 1726867214.72653: done getting the remaining hosts for this loop
13131 1726867214.72656: getting the next task for host managed_node1
13131 1726867214.72661: done getting next task for host managed_node1
13131 1726867214.72663: ^ task is: TASK: Assert that the port2 device is in DOWN state
13131 1726867214.72666: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867214.72669: getting variables
13131 1726867214.72670: in VariableManager get_vars()
13131 1726867214.72717: Calling all_inventory to load vars for managed_node1
13131 1726867214.72720: Calling groups_inventory to load vars for managed_node1
13131 1726867214.72722: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867214.72731: Calling all_plugins_play to load vars for managed_node1
13131 1726867214.72733: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867214.72736: Calling groups_plugins_play to load vars for managed_node1
13131 1726867214.73290: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000c6
13131 1726867214.73293: WORKER PROCESS EXITING
13131 1726867214.74145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867214.75930: done with get_vars()
13131 1726867214.75950: done getting variables
13131 1726867214.76007: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Assert that the port2 device is in DOWN state] ***************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:153
Friday 20 September 2024 17:20:14 -0400 (0:00:00.055) 0:00:29.870 ******
13131 1726867214.76033: entering _queue_task() for managed_node1/assert
13131 1726867214.76304: worker is 1 (out of 1 available)
13131 1726867214.76316: exiting _queue_task() for managed_node1/assert
13131 1726867214.76327: done queuing things up, now waiting for results queue to drain
13131 1726867214.76328: waiting for pending results...
13131 1726867214.76592: running TaskExecutor() for managed_node1/TASK: Assert that the port2 device is in DOWN state
13131 1726867214.76699: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000c7
13131 1726867214.76720: variable 'ansible_search_path' from source: unknown
13131 1726867214.76760: calling self._execute()
13131 1726867214.76868: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867214.76883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867214.76896: variable 'omit' from source: magic vars
13131 1726867214.77247: variable 'ansible_distribution_major_version' from source: facts
13131 1726867214.77382: Evaluated conditional (ansible_distribution_major_version != '6'): True
13131 1726867214.77385: variable 'network_provider' from source: set_fact
13131 1726867214.77387: Evaluated conditional (network_provider == "initscripts"): False
13131 1726867214.77389: when evaluation is False, skipping this task
13131 1726867214.77392: _execute() done
13131 1726867214.77400: dumping result to json
13131 1726867214.77407: done dumping result, returning
13131 1726867214.77416: done running TaskExecutor() for managed_node1/TASK: Assert that the port2 device is in DOWN state [0affcac9-a3a5-5f24-9b7a-0000000000c7]
13131 1726867214.77425: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000c7
13131 1726867214.77683: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000c7
13131 1726867214.77687: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
13131 1726867214.77724: no more pending results, returning what we have
13131 1726867214.77727: results queue empty
13131 1726867214.77728: checking for any_errors_fatal
13131 1726867214.77733: done checking for any_errors_fatal
13131 1726867214.77734: checking for max_fail_percentage
13131 1726867214.77736: done checking for max_fail_percentage
13131 1726867214.77736: checking to see if all hosts have failed and the running result is not ok
13131 1726867214.77737: done checking to see if all hosts have failed
13131 1726867214.77738: getting the remaining hosts for this loop
13131 1726867214.77739: done getting the remaining hosts for this loop
13131 1726867214.77742: getting the next task for host managed_node1
13131 1726867214.77749: done getting next task for host managed_node1
13131 1726867214.77753: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
13131 1726867214.77756: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867214.77775: getting variables
13131 1726867214.77776: in VariableManager get_vars()
13131 1726867214.77823: Calling all_inventory to load vars for managed_node1
13131 1726867214.77825: Calling groups_inventory to load vars for managed_node1
13131 1726867214.77827: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867214.77836: Calling all_plugins_play to load vars for managed_node1
13131 1726867214.77839: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867214.77842: Calling groups_plugins_play to load vars for managed_node1
13131 1726867214.79172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867214.80832: done with get_vars()
13131 1726867214.80850: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 17:20:14 -0400 (0:00:00.049) 0:00:29.920 ******
13131 1726867214.80946: entering _queue_task() for managed_node1/include_tasks
13131 1726867214.81242: worker is 1 (out of 1 available)
13131 1726867214.81257: exiting _queue_task() for managed_node1/include_tasks
13131 1726867214.81269: done queuing things up, now waiting for results queue to drain
13131 1726867214.81271: waiting for pending results...
13131 1726867214.81559: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
13131 1726867214.81725: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000cf
13131 1726867214.81745: variable 'ansible_search_path' from source: unknown
13131 1726867214.81754: variable 'ansible_search_path' from source: unknown
13131 1726867214.81799: calling self._execute()
13131 1726867214.81901: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867214.81918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867214.81938: variable 'omit' from source: magic vars
13131 1726867214.82309: variable 'ansible_distribution_major_version' from source: facts
13131 1726867214.82327: Evaluated conditional (ansible_distribution_major_version != '6'): True
13131 1726867214.82338: _execute() done
13131 1726867214.82346: dumping result to json
13131 1726867214.82354: done dumping result, returning
13131 1726867214.82371: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-5f24-9b7a-0000000000cf]
13131 1726867214.82470: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000cf
13131 1726867214.82544: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000cf
13131 1726867214.82548: WORKER PROCESS EXITING
13131 1726867214.82594: no more pending results, returning what we have
13131 1726867214.82599: in VariableManager get_vars()
13131 1726867214.82660: Calling all_inventory to load vars for managed_node1
13131 1726867214.82663: Calling groups_inventory to load vars for managed_node1
13131 1726867214.82666: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867214.82681: Calling all_plugins_play to load vars for managed_node1
13131 1726867214.82684: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867214.82688: Calling groups_plugins_play to load vars for managed_node1
13131 1726867214.84120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867214.85638: done with get_vars()
13131 1726867214.85657: variable 'ansible_search_path' from source: unknown
13131 1726867214.85658: variable 'ansible_search_path' from source: unknown
13131 1726867214.85697: we have included files to process
13131 1726867214.85699: generating all_blocks data
13131 1726867214.85701: done generating all_blocks data
13131 1726867214.85706: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
13131 1726867214.85707: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
13131 1726867214.85709: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
13131 1726867214.86268: done processing included file
13131 1726867214.86271: iterating over new_blocks loaded from include file
13131 1726867214.86272: in VariableManager get_vars()
13131 1726867214.86306: done with get_vars()
13131 1726867214.86308: filtering new block on tags
13131 1726867214.86326: done filtering new block on tags
13131 1726867214.86328: in VariableManager get_vars()
13131 1726867214.86358: done with get_vars()
13131 1726867214.86360: filtering new block on tags
13131 1726867214.86382: done filtering new block on tags
13131 1726867214.86385: in VariableManager get_vars()
13131 1726867214.86414: done with get_vars()
13131 1726867214.86416: filtering new block on tags
13131 1726867214.86433: done filtering new block on tags
13131 1726867214.86436: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1
13131 1726867214.86441: extending task lists for all hosts with included blocks
13131 1726867214.87250: done extending task lists
13131 1726867214.87252: done processing included files
13131 1726867214.87253: results queue empty
13131 1726867214.87253: checking for any_errors_fatal
13131 1726867214.87256: done checking for any_errors_fatal
13131 1726867214.87257: checking for max_fail_percentage
13131 1726867214.87258: done checking for max_fail_percentage
13131 1726867214.87259: checking to see if all hosts have failed and the running result is not ok
13131 1726867214.87259: done checking to see if all hosts have failed
13131 1726867214.87260: getting the remaining hosts for this loop
13131 1726867214.87261: done getting the remaining hosts for this loop
13131 1726867214.87264: getting the next task for host managed_node1
13131 1726867214.87268: done getting next task for host managed_node1
13131 1726867214.87271: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
13131 1726867214.87274: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867214.87288: getting variables
13131 1726867214.87289: in VariableManager get_vars()
13131 1726867214.87308: Calling all_inventory to load vars for managed_node1
13131 1726867214.87311: Calling groups_inventory to load vars for managed_node1
13131 1726867214.87313: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867214.87318: Calling all_plugins_play to load vars for managed_node1
13131 1726867214.87321: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867214.87324: Calling groups_plugins_play to load vars for managed_node1
13131 1726867214.88496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867214.89964: done with get_vars()
13131 1726867214.89984: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 17:20:14 -0400 (0:00:00.091) 0:00:30.011 ******
13131 1726867214.90054: entering _queue_task() for managed_node1/setup
13131 1726867214.90355: worker is 1 (out of 1 available)
13131 1726867214.90367: exiting _queue_task() for managed_node1/setup
13131 1726867214.90482: done queuing things up, now waiting for results queue to drain
13131 1726867214.90484: waiting for pending results...
13131 1726867214.90793: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
13131 1726867214.90814: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000796
13131 1726867214.90835: variable 'ansible_search_path' from source: unknown
13131 1726867214.90842: variable 'ansible_search_path' from source: unknown
13131 1726867214.90884: calling self._execute()
13131 1726867214.90981: variable 'ansible_host' from source: host vars for 'managed_node1'
13131 1726867214.90994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13131 1726867214.91010: variable 'omit' from source: magic vars
13131 1726867214.91368: variable 'ansible_distribution_major_version' from source: facts
13131 1726867214.91453: Evaluated conditional (ansible_distribution_major_version != '6'): True
13131 1726867214.91598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13131 1726867214.93666: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13131 1726867214.93749: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13131 1726867214.93791: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13131 1726867214.93830: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13131 1726867214.93862: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13131 1726867214.94184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13131 1726867214.94188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13131 1726867214.94190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13131 1726867214.94193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13131 1726867214.94195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13131 1726867214.94197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13131 1726867214.94199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13131 1726867214.94201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13131 1726867214.94203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13131 1726867214.94218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13131 1726867214.94373: variable '__network_required_facts' from source: role
'' defaults 13131 1726867214.94388: variable 'ansible_facts' from source: unknown 13131 1726867214.95112: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13131 1726867214.95121: when evaluation is False, skipping this task 13131 1726867214.95127: _execute() done 13131 1726867214.95134: dumping result to json 13131 1726867214.95141: done dumping result, returning 13131 1726867214.95151: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-5f24-9b7a-000000000796] 13131 1726867214.95159: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000796 13131 1726867214.95254: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000796 13131 1726867214.95261: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867214.95328: no more pending results, returning what we have 13131 1726867214.95331: results queue empty 13131 1726867214.95333: checking for any_errors_fatal 13131 1726867214.95334: done checking for any_errors_fatal 13131 1726867214.95335: checking for max_fail_percentage 13131 1726867214.95337: done checking for max_fail_percentage 13131 1726867214.95338: checking to see if all hosts have failed and the running result is not ok 13131 1726867214.95338: done checking to see if all hosts have failed 13131 1726867214.95339: getting the remaining hosts for this loop 13131 1726867214.95341: done getting the remaining hosts for this loop 13131 1726867214.95344: getting the next task for host managed_node1 13131 1726867214.95353: done getting next task for host managed_node1 13131 1726867214.95357: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13131 1726867214.95361: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867214.95383: getting variables 13131 1726867214.95385: in VariableManager get_vars() 13131 1726867214.95438: Calling all_inventory to load vars for managed_node1 13131 1726867214.95441: Calling groups_inventory to load vars for managed_node1 13131 1726867214.95443: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867214.95453: Calling all_plugins_play to load vars for managed_node1 13131 1726867214.95456: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867214.95459: Calling groups_plugins_play to load vars for managed_node1 13131 1726867214.96997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867214.98512: done with get_vars() 13131 1726867214.98533: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:20:14 -0400 (0:00:00.085) 0:00:30.096 ****** 13131 1726867214.98632: entering _queue_task() for managed_node1/stat 13131 1726867214.98924: worker is 1 (out of 1 
available) 13131 1726867214.98936: exiting _queue_task() for managed_node1/stat 13131 1726867214.98948: done queuing things up, now waiting for results queue to drain 13131 1726867214.98950: waiting for pending results... 13131 1726867214.99220: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 13131 1726867214.99374: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000798 13131 1726867214.99402: variable 'ansible_search_path' from source: unknown 13131 1726867214.99410: variable 'ansible_search_path' from source: unknown 13131 1726867214.99448: calling self._execute() 13131 1726867214.99546: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867214.99558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867214.99570: variable 'omit' from source: magic vars 13131 1726867214.99932: variable 'ansible_distribution_major_version' from source: facts 13131 1726867215.00048: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867215.00115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867215.00385: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867215.00432: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867215.00469: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867215.00512: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867215.00596: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867215.00625: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867215.00654: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867215.00685: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867215.00774: variable '__network_is_ostree' from source: set_fact 13131 1726867215.00788: Evaluated conditional (not __network_is_ostree is defined): False 13131 1726867215.00795: when evaluation is False, skipping this task 13131 1726867215.00805: _execute() done 13131 1726867215.00813: dumping result to json 13131 1726867215.00820: done dumping result, returning 13131 1726867215.00983: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-5f24-9b7a-000000000798] 13131 1726867215.00986: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000798 13131 1726867215.01049: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000798 13131 1726867215.01052: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13131 1726867215.01104: no more pending results, returning what we have 13131 1726867215.01108: results queue empty 13131 1726867215.01109: checking for any_errors_fatal 13131 1726867215.01117: done checking for any_errors_fatal 13131 1726867215.01118: checking for max_fail_percentage 13131 1726867215.01120: done checking for max_fail_percentage 13131 1726867215.01121: checking to see if all hosts have failed and the running result is not ok 13131 
1726867215.01122: done checking to see if all hosts have failed 13131 1726867215.01122: getting the remaining hosts for this loop 13131 1726867215.01124: done getting the remaining hosts for this loop 13131 1726867215.01127: getting the next task for host managed_node1 13131 1726867215.01134: done getting next task for host managed_node1 13131 1726867215.01138: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13131 1726867215.01142: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867215.01164: getting variables 13131 1726867215.01166: in VariableManager get_vars() 13131 1726867215.01222: Calling all_inventory to load vars for managed_node1 13131 1726867215.01225: Calling groups_inventory to load vars for managed_node1 13131 1726867215.01227: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867215.01236: Calling all_plugins_play to load vars for managed_node1 13131 1726867215.01238: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867215.01241: Calling groups_plugins_play to load vars for managed_node1 13131 1726867215.02772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867215.04291: done with get_vars() 13131 1726867215.04310: done getting variables 13131 1726867215.04364: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:20:15 -0400 (0:00:00.057) 0:00:30.154 ****** 13131 1726867215.04401: entering _queue_task() for managed_node1/set_fact 13131 1726867215.04678: worker is 1 (out of 1 available) 13131 1726867215.04690: exiting _queue_task() for managed_node1/set_fact 13131 1726867215.04700: done queuing things up, now waiting for results queue to drain 13131 1726867215.04701: waiting for pending results... 
13131 1726867215.05093: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13131 1726867215.05113: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000799 13131 1726867215.05132: variable 'ansible_search_path' from source: unknown 13131 1726867215.05139: variable 'ansible_search_path' from source: unknown 13131 1726867215.05174: calling self._execute() 13131 1726867215.05272: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867215.05286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867215.05304: variable 'omit' from source: magic vars 13131 1726867215.05650: variable 'ansible_distribution_major_version' from source: facts 13131 1726867215.05666: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867215.05836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867215.06105: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867215.06151: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867215.06194: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867215.06233: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867215.06320: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867215.06384: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867215.06387: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867215.06414: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867215.06509: variable '__network_is_ostree' from source: set_fact 13131 1726867215.06521: Evaluated conditional (not __network_is_ostree is defined): False 13131 1726867215.06529: when evaluation is False, skipping this task 13131 1726867215.06600: _execute() done 13131 1726867215.06603: dumping result to json 13131 1726867215.06605: done dumping result, returning 13131 1726867215.06608: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-5f24-9b7a-000000000799] 13131 1726867215.06610: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000799 13131 1726867215.06662: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000799 13131 1726867215.06664: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13131 1726867215.06753: no more pending results, returning what we have 13131 1726867215.06757: results queue empty 13131 1726867215.06759: checking for any_errors_fatal 13131 1726867215.06765: done checking for any_errors_fatal 13131 1726867215.06766: checking for max_fail_percentage 13131 1726867215.06768: done checking for max_fail_percentage 13131 1726867215.06769: checking to see if all hosts have failed and the running result is not ok 13131 1726867215.06770: done checking to see if all hosts have failed 13131 1726867215.06771: getting the remaining hosts for this loop 13131 1726867215.06772: done getting the remaining hosts for this loop 
13131 1726867215.06776: getting the next task for host managed_node1 13131 1726867215.06789: done getting next task for host managed_node1 13131 1726867215.06793: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13131 1726867215.06797: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867215.06820: getting variables 13131 1726867215.06822: in VariableManager get_vars() 13131 1726867215.06983: Calling all_inventory to load vars for managed_node1 13131 1726867215.06986: Calling groups_inventory to load vars for managed_node1 13131 1726867215.06989: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867215.06999: Calling all_plugins_play to load vars for managed_node1 13131 1726867215.07002: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867215.07005: Calling groups_plugins_play to load vars for managed_node1 13131 1726867215.08347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867215.09949: done with get_vars() 13131 1726867215.09969: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:20:15 -0400 (0:00:00.056) 0:00:30.211 ****** 13131 1726867215.10065: entering _queue_task() for managed_node1/service_facts 13131 1726867215.10333: worker is 1 (out of 1 available) 13131 1726867215.10346: exiting _queue_task() for managed_node1/service_facts 13131 1726867215.10357: done queuing things up, now waiting for results queue to drain 13131 1726867215.10358: waiting for pending results... 
13131 1726867215.10792: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 13131 1726867215.10797: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000079b 13131 1726867215.10800: variable 'ansible_search_path' from source: unknown 13131 1726867215.10809: variable 'ansible_search_path' from source: unknown 13131 1726867215.10849: calling self._execute() 13131 1726867215.10944: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867215.10957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867215.10971: variable 'omit' from source: magic vars 13131 1726867215.11325: variable 'ansible_distribution_major_version' from source: facts 13131 1726867215.11341: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867215.11359: variable 'omit' from source: magic vars 13131 1726867215.11436: variable 'omit' from source: magic vars 13131 1726867215.11482: variable 'omit' from source: magic vars 13131 1726867215.11524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867215.11565: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867215.11595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867215.11617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867215.11683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867215.11686: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867215.11689: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867215.11691: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 13131 1726867215.11783: Set connection var ansible_connection to ssh 13131 1726867215.11803: Set connection var ansible_timeout to 10 13131 1726867215.11811: Set connection var ansible_shell_type to sh 13131 1726867215.11824: Set connection var ansible_shell_executable to /bin/sh 13131 1726867215.11839: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867215.11849: Set connection var ansible_pipelining to False 13131 1726867215.11875: variable 'ansible_shell_executable' from source: unknown 13131 1726867215.11901: variable 'ansible_connection' from source: unknown 13131 1726867215.11904: variable 'ansible_module_compression' from source: unknown 13131 1726867215.11906: variable 'ansible_shell_type' from source: unknown 13131 1726867215.11909: variable 'ansible_shell_executable' from source: unknown 13131 1726867215.11910: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867215.12009: variable 'ansible_pipelining' from source: unknown 13131 1726867215.12013: variable 'ansible_timeout' from source: unknown 13131 1726867215.12015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867215.12133: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867215.12150: variable 'omit' from source: magic vars 13131 1726867215.12160: starting attempt loop 13131 1726867215.12168: running the handler 13131 1726867215.12188: _low_level_execute_command(): starting 13131 1726867215.12200: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867215.12906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867215.13000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867215.13072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867215.13121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867215.14785: stdout chunk (state=3): >>>/root <<< 13131 1726867215.14916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867215.14919: stderr chunk (state=3): >>><<< 13131 1726867215.14924: stdout chunk (state=3): >>><<< 13131 1726867215.15147: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867215.15150: _low_level_execute_command(): starting 13131 1726867215.15154: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867215.149426-14655-6413313251978 `" && echo ansible-tmp-1726867215.149426-14655-6413313251978="` echo /root/.ansible/tmp/ansible-tmp-1726867215.149426-14655-6413313251978 `" ) && sleep 0' 13131 1726867215.15623: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13131 1726867215.15629: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 
originally 10.31.12.57 debug2: match found <<< 13131 1726867215.15654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867215.15725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867215.15731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867215.15750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867215.15828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867215.17709: stdout chunk (state=3): >>>ansible-tmp-1726867215.149426-14655-6413313251978=/root/.ansible/tmp/ansible-tmp-1726867215.149426-14655-6413313251978 <<< 13131 1726867215.17985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867215.17989: stderr chunk (state=3): >>><<< 13131 1726867215.17991: stdout chunk (state=3): >>><<< 13131 1726867215.17994: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867215.149426-14655-6413313251978=/root/.ansible/tmp/ansible-tmp-1726867215.149426-14655-6413313251978 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867215.17996: variable 'ansible_module_compression' from source: unknown 13131 1726867215.17998: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13131 1726867215.18003: variable 'ansible_facts' from source: unknown 13131 1726867215.18089: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867215.149426-14655-6413313251978/AnsiballZ_service_facts.py 13131 1726867215.18294: Sending initial data 13131 1726867215.18298: Sent initial data (159 bytes) 13131 1726867215.18786: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867215.18796: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867215.18862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867215.18894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867215.18915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867215.18926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867215.18943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867215.19023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867215.20615: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13131 1726867215.20644: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867215.20687: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867215.20742: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmptlgbp2x9 /root/.ansible/tmp/ansible-tmp-1726867215.149426-14655-6413313251978/AnsiballZ_service_facts.py <<< 13131 1726867215.20746: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867215.149426-14655-6413313251978/AnsiballZ_service_facts.py" <<< 13131 1726867215.20791: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmptlgbp2x9" to remote "/root/.ansible/tmp/ansible-tmp-1726867215.149426-14655-6413313251978/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867215.149426-14655-6413313251978/AnsiballZ_service_facts.py" <<< 13131 1726867215.21659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867215.21663: stderr chunk (state=3): >>><<< 13131 1726867215.21665: stdout chunk (state=3): >>><<< 13131 1726867215.21674: done transferring module to remote 13131 1726867215.21692: _low_level_execute_command(): starting 13131 1726867215.21704: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867215.149426-14655-6413313251978/ /root/.ansible/tmp/ansible-tmp-1726867215.149426-14655-6413313251978/AnsiballZ_service_facts.py && sleep 0' 13131 1726867215.22349: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867215.22362: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867215.22381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867215.22429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867215.22447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867215.22532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867215.22558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867215.22582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867215.22660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867215.24497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867215.24518: stdout chunk (state=3): >>><<< 13131 1726867215.24529: stderr chunk (state=3): >>><<< 13131 1726867215.24624: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867215.24627: _low_level_execute_command(): starting 13131 1726867215.24630: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867215.149426-14655-6413313251978/AnsiballZ_service_facts.py && sleep 0' 13131 1726867215.25105: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867215.25128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13131 1726867215.25140: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867215.25190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' 
<<< 13131 1726867215.25207: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867215.25254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867216.80138: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 13131 1726867216.80171: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 13131 1726867216.80187: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": 
"blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 13131 1726867216.80200: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 13131 1726867216.80204: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13131 1726867216.81797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867216.81801: stdout chunk (state=3): >>><<< 13131 1726867216.81803: stderr chunk (state=3): >>><<< 13131 1726867216.81990: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": 
"getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": 
"systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": 
"running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": 
"systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867216.82385: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867215.149426-14655-6413313251978/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867216.82393: _low_level_execute_command(): starting 13131 1726867216.82397: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867215.149426-14655-6413313251978/ > /dev/null 2>&1 && sleep 0' 13131 1726867216.82865: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867216.82869: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867216.82871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867216.82944: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867216.82950: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867216.82996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867216.84986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867216.84989: stdout chunk (state=3): >>><<< 13131 1726867216.84992: stderr chunk (state=3): >>><<< 13131 1726867216.84994: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 13131 1726867216.84996: handler run complete 13131 1726867216.85033: variable 'ansible_facts' from source: unknown 13131 1726867216.85188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867216.85688: variable 'ansible_facts' from source: unknown 13131 1726867216.85831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867216.86151: attempt loop complete, returning result 13131 1726867216.86482: _execute() done 13131 1726867216.86485: dumping result to json 13131 1726867216.86487: done dumping result, returning 13131 1726867216.86489: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-5f24-9b7a-00000000079b] 13131 1726867216.86491: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000079b ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867216.87851: no more pending results, returning what we have 13131 1726867216.87854: results queue empty 13131 1726867216.87855: checking for any_errors_fatal 13131 1726867216.87859: done checking for any_errors_fatal 13131 1726867216.87859: checking for max_fail_percentage 13131 1726867216.87861: done checking for max_fail_percentage 13131 1726867216.87862: checking to see if all hosts have failed and the running result is not ok 13131 1726867216.87862: done checking to see if all hosts have failed 13131 1726867216.87863: getting the remaining hosts for this loop 13131 1726867216.87865: done getting the remaining hosts for this loop 13131 1726867216.87868: getting the next task for host managed_node1 13131 1726867216.87873: done getting next task for host managed_node1 13131 1726867216.87876: ^ task is: TASK: 
fedora.linux_system_roles.network : Check which packages are installed 13131 1726867216.87882: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867216.87895: getting variables 13131 1726867216.87897: in VariableManager get_vars() 13131 1726867216.88051: Calling all_inventory to load vars for managed_node1 13131 1726867216.88054: Calling groups_inventory to load vars for managed_node1 13131 1726867216.88056: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867216.88066: Calling all_plugins_play to load vars for managed_node1 13131 1726867216.88068: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867216.88071: Calling groups_plugins_play to load vars for managed_node1 13131 1726867216.88590: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000079b 13131 1726867216.88594: WORKER PROCESS EXITING 13131 1726867216.89467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867216.95300: done with get_vars() 13131 1726867216.95327: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task 
path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:20:16 -0400 (0:00:01.853) 0:00:32.064 ****** 13131 1726867216.95408: entering _queue_task() for managed_node1/package_facts 13131 1726867216.95756: worker is 1 (out of 1 available) 13131 1726867216.95769: exiting _queue_task() for managed_node1/package_facts 13131 1726867216.95985: done queuing things up, now waiting for results queue to drain 13131 1726867216.95987: waiting for pending results... 13131 1726867216.96074: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 13131 1726867216.96326: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000079c 13131 1726867216.96331: variable 'ansible_search_path' from source: unknown 13131 1726867216.96335: variable 'ansible_search_path' from source: unknown 13131 1726867216.96343: calling self._execute() 13131 1726867216.96444: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867216.96459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867216.96471: variable 'omit' from source: magic vars 13131 1726867216.96841: variable 'ansible_distribution_major_version' from source: facts 13131 1726867216.96867: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867216.96978: variable 'omit' from source: magic vars 13131 1726867216.96982: variable 'omit' from source: magic vars 13131 1726867216.96984: variable 'omit' from source: magic vars 13131 1726867216.97026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867216.97066: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867216.97093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867216.97116: Loading ShellModule 'sh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867216.97133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867216.97168: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867216.97180: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867216.97194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867216.97298: Set connection var ansible_connection to ssh 13131 1726867216.97317: Set connection var ansible_timeout to 10 13131 1726867216.97325: Set connection var ansible_shell_type to sh 13131 1726867216.97338: Set connection var ansible_shell_executable to /bin/sh 13131 1726867216.97352: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867216.97360: Set connection var ansible_pipelining to False 13131 1726867216.97387: variable 'ansible_shell_executable' from source: unknown 13131 1726867216.97395: variable 'ansible_connection' from source: unknown 13131 1726867216.97405: variable 'ansible_module_compression' from source: unknown 13131 1726867216.97418: variable 'ansible_shell_type' from source: unknown 13131 1726867216.97525: variable 'ansible_shell_executable' from source: unknown 13131 1726867216.97528: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867216.97530: variable 'ansible_pipelining' from source: unknown 13131 1726867216.97532: variable 'ansible_timeout' from source: unknown 13131 1726867216.97534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867216.97655: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867216.97673: variable 'omit' from source: magic vars 13131 1726867216.97686: starting attempt loop 13131 1726867216.97693: running the handler 13131 1726867216.97715: _low_level_execute_command(): starting 13131 1726867216.97728: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867216.98493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867216.98521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867216.98540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867216.98634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867216.98674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867216.98695: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867216.98735: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867216.98816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 13131 1726867217.00496: stdout chunk (state=3): >>>/root <<< 13131 1726867217.00638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867217.00648: stdout chunk (state=3): >>><<< 13131 1726867217.00659: stderr chunk (state=3): >>><<< 13131 1726867217.00773: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867217.00780: _low_level_execute_command(): starting 13131 1726867217.00784: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867217.0068502-14714-165164958912871 `" && echo ansible-tmp-1726867217.0068502-14714-165164958912871="` echo /root/.ansible/tmp/ansible-tmp-1726867217.0068502-14714-165164958912871 `" ) && sleep 0' 13131 
1726867217.01346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867217.01350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867217.01353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867217.01382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867217.01385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867217.01492: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867217.01496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867217.01519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867217.01543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867217.01560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867217.01643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867217.03547: stdout chunk (state=3): >>>ansible-tmp-1726867217.0068502-14714-165164958912871=/root/.ansible/tmp/ansible-tmp-1726867217.0068502-14714-165164958912871 <<< 13131 1726867217.03711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
13131 1726867217.03715: stdout chunk (state=3): >>><<< 13131 1726867217.03718: stderr chunk (state=3): >>><<< 13131 1726867217.03836: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867217.0068502-14714-165164958912871=/root/.ansible/tmp/ansible-tmp-1726867217.0068502-14714-165164958912871 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867217.03840: variable 'ansible_module_compression' from source: unknown 13131 1726867217.03866: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13131 1726867217.03927: variable 'ansible_facts' from source: unknown 13131 1726867217.04105: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867217.0068502-14714-165164958912871/AnsiballZ_package_facts.py 13131 1726867217.04382: Sending initial data 
13131 1726867217.04385: Sent initial data (162 bytes) 13131 1726867217.04939: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867217.04954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867217.04996: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867217.05062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867217.05110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867217.05167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867217.06719: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: 
Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867217.06763: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13131 1726867217.06809: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpq2a39hw_ /root/.ansible/tmp/ansible-tmp-1726867217.0068502-14714-165164958912871/AnsiballZ_package_facts.py <<< 13131 1726867217.06815: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867217.0068502-14714-165164958912871/AnsiballZ_package_facts.py" <<< 13131 1726867217.06856: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpq2a39hw_" to remote "/root/.ansible/tmp/ansible-tmp-1726867217.0068502-14714-165164958912871/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867217.0068502-14714-165164958912871/AnsiballZ_package_facts.py" <<< 13131 1726867217.07926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867217.07985: stderr chunk (state=3): >>><<< 13131 1726867217.07989: stdout chunk (state=3): >>><<< 13131 1726867217.07991: done transferring module to remote 13131 1726867217.07993: _low_level_execute_command(): starting 13131 1726867217.07996: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867217.0068502-14714-165164958912871/ /root/.ansible/tmp/ansible-tmp-1726867217.0068502-14714-165164958912871/AnsiballZ_package_facts.py && sleep 0' 13131 1726867217.08499: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867217.08738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867217.08742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867217.08749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867217.09203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867217.10609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867217.10652: stderr chunk (state=3): >>><<< 13131 1726867217.10655: stdout chunk (state=3): >>><<< 13131 1726867217.10670: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867217.10673: _low_level_execute_command(): starting 13131 1726867217.10883: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867217.0068502-14714-165164958912871/AnsiballZ_package_facts.py && sleep 0' 13131 1726867217.11229: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867217.11238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867217.11249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867217.11262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867217.11279: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867217.11291: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867217.11300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867217.11314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867217.11325: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is 
address <<< 13131 1726867217.11389: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867217.11412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867217.11423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867217.11440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867217.11508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867217.55917: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": 
"17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 13131 1726867217.56051: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", 
"version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": 
[{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": 
"systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": 
"iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 13131 1726867217.56147: stdout chunk (state=3): >>>3.1", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": 
"pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": 
"libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", 
"release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", 
"version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", 
"source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, 
"arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13131 1726867217.57981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867217.57986: stdout chunk (state=3): >>><<< 13131 1726867217.57989: stderr chunk (state=3): >>><<< 13131 1726867217.58188: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867217.60224: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867217.0068502-14714-165164958912871/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867217.60249: _low_level_execute_command(): starting 13131 1726867217.60258: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867217.0068502-14714-165164958912871/ > /dev/null 2>&1 && sleep 0' 13131 1726867217.60886: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867217.60903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867217.60924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867217.60949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867217.61052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867217.61071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867217.61093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867217.61171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867217.63609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867217.63619: stdout chunk (state=3): >>><<< 13131 1726867217.63634: stderr chunk (state=3): >>><<< 13131 1726867217.63653: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 13131 1726867217.63783: handler run complete 13131 1726867217.64628: variable 'ansible_facts' from source: unknown 13131 1726867217.65124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867217.67254: variable 'ansible_facts' from source: unknown 13131 1726867217.67721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867217.68443: attempt loop complete, returning result 13131 1726867217.68466: _execute() done 13131 1726867217.68475: dumping result to json 13131 1726867217.68696: done dumping result, returning 13131 1726867217.68713: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-5f24-9b7a-00000000079c] 13131 1726867217.68722: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000079c 13131 1726867217.71070: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000079c 13131 1726867217.71073: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867217.71228: no more pending results, returning what we have 13131 1726867217.71231: results queue empty 13131 1726867217.71232: checking for any_errors_fatal 13131 1726867217.71242: done checking for any_errors_fatal 13131 1726867217.71243: checking for max_fail_percentage 13131 1726867217.71244: done checking for max_fail_percentage 13131 1726867217.71245: checking to see if all hosts have failed and the running result is not ok 13131 1726867217.71247: done checking to see if all hosts have failed 13131 1726867217.71247: getting the remaining hosts for this loop 13131 1726867217.71249: done getting the remaining hosts for this loop 13131 1726867217.71252: getting the next task for host managed_node1 13131 1726867217.71258: done 
getting next task for host managed_node1 13131 1726867217.71262: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13131 1726867217.71265: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867217.71278: getting variables 13131 1726867217.71279: in VariableManager get_vars() 13131 1726867217.71321: Calling all_inventory to load vars for managed_node1 13131 1726867217.71324: Calling groups_inventory to load vars for managed_node1 13131 1726867217.71326: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867217.71335: Calling all_plugins_play to load vars for managed_node1 13131 1726867217.71337: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867217.71340: Calling groups_plugins_play to load vars for managed_node1 13131 1726867217.72670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867217.74320: done with get_vars() 13131 1726867217.74340: done getting variables 13131 1726867217.74407: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:20:17 -0400 (0:00:00.790) 0:00:32.854 ****** 13131 1726867217.74443: entering _queue_task() for managed_node1/debug 13131 1726867217.74811: worker is 1 (out of 1 available) 13131 1726867217.74827: exiting _queue_task() for managed_node1/debug 13131 1726867217.74836: done queuing things up, now waiting for results queue to drain 13131 1726867217.74838: waiting for pending results... 13131 1726867217.75051: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 13131 1726867217.75200: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000d0 13131 1726867217.75226: variable 'ansible_search_path' from source: unknown 13131 1726867217.75234: variable 'ansible_search_path' from source: unknown 13131 1726867217.75285: calling self._execute() 13131 1726867217.75476: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867217.75482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867217.75485: variable 'omit' from source: magic vars 13131 1726867217.75824: variable 'ansible_distribution_major_version' from source: facts 13131 1726867217.75841: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867217.75852: variable 'omit' from source: magic vars 13131 1726867217.75922: variable 'omit' from source: magic vars 13131 1726867217.76032: variable 'network_provider' from source: set_fact 13131 1726867217.76057: variable 'omit' from source: magic vars 13131 1726867217.76106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867217.76152: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867217.76175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 
1726867217.76200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867217.76220: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867217.76282: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867217.76286: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867217.76289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867217.76382: Set connection var ansible_connection to ssh 13131 1726867217.76396: Set connection var ansible_timeout to 10 13131 1726867217.76456: Set connection var ansible_shell_type to sh 13131 1726867217.76460: Set connection var ansible_shell_executable to /bin/sh 13131 1726867217.76462: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867217.76464: Set connection var ansible_pipelining to False 13131 1726867217.76468: variable 'ansible_shell_executable' from source: unknown 13131 1726867217.76476: variable 'ansible_connection' from source: unknown 13131 1726867217.76486: variable 'ansible_module_compression' from source: unknown 13131 1726867217.76492: variable 'ansible_shell_type' from source: unknown 13131 1726867217.76499: variable 'ansible_shell_executable' from source: unknown 13131 1726867217.76509: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867217.76517: variable 'ansible_pipelining' from source: unknown 13131 1726867217.76524: variable 'ansible_timeout' from source: unknown 13131 1726867217.76532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867217.76682: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867217.76785: variable 'omit' from source: magic vars 13131 1726867217.76788: starting attempt loop 13131 1726867217.76791: running the handler 13131 1726867217.76793: handler run complete 13131 1726867217.76795: attempt loop complete, returning result 13131 1726867217.76797: _execute() done 13131 1726867217.76800: dumping result to json 13131 1726867217.76804: done dumping result, returning 13131 1726867217.76807: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-5f24-9b7a-0000000000d0] 13131 1726867217.76813: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d0 ok: [managed_node1] => {} MSG: Using network provider: nm 13131 1726867217.77044: no more pending results, returning what we have 13131 1726867217.77048: results queue empty 13131 1726867217.77049: checking for any_errors_fatal 13131 1726867217.77059: done checking for any_errors_fatal 13131 1726867217.77060: checking for max_fail_percentage 13131 1726867217.77062: done checking for max_fail_percentage 13131 1726867217.77063: checking to see if all hosts have failed and the running result is not ok 13131 1726867217.77063: done checking to see if all hosts have failed 13131 1726867217.77064: getting the remaining hosts for this loop 13131 1726867217.77066: done getting the remaining hosts for this loop 13131 1726867217.77068: getting the next task for host managed_node1 13131 1726867217.77075: done getting next task for host managed_node1 13131 1726867217.77085: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13131 1726867217.77088: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867217.77287: getting variables 13131 1726867217.77289: in VariableManager get_vars() 13131 1726867217.77334: Calling all_inventory to load vars for managed_node1 13131 1726867217.77337: Calling groups_inventory to load vars for managed_node1 13131 1726867217.77339: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867217.77347: Calling all_plugins_play to load vars for managed_node1 13131 1726867217.77350: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867217.77353: Calling groups_plugins_play to load vars for managed_node1 13131 1726867217.77893: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d0 13131 1726867217.77897: WORKER PROCESS EXITING 13131 1726867217.78748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867217.80312: done with get_vars() 13131 1726867217.80336: done getting variables 13131 1726867217.80395: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:20:17 -0400 (0:00:00.059) 0:00:32.914 ****** 13131 1726867217.80438: entering _queue_task() for managed_node1/fail 13131 1726867217.80893: worker is 1 (out of 1 available) 13131 1726867217.80907: exiting _queue_task() for managed_node1/fail 13131 1726867217.80918: done queuing things up, now waiting for results queue to drain 13131 1726867217.80919: waiting for pending results... 13131 1726867217.81125: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13131 1726867217.81315: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000d1 13131 1726867217.81339: variable 'ansible_search_path' from source: unknown 13131 1726867217.81349: variable 'ansible_search_path' from source: unknown 13131 1726867217.81406: calling self._execute() 13131 1726867217.81524: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867217.81538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867217.81554: variable 'omit' from source: magic vars 13131 1726867217.81973: variable 'ansible_distribution_major_version' from source: facts 13131 1726867217.81991: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867217.82135: variable 'network_state' from source: role '' defaults 13131 1726867217.82150: Evaluated conditional (network_state != {}): False 13131 1726867217.82163: when evaluation is False, skipping this task 13131 1726867217.82171: _execute() done 13131 1726867217.82180: dumping result to json 13131 1726867217.82188: done dumping result, returning 13131 1726867217.82200: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [0affcac9-a3a5-5f24-9b7a-0000000000d1] 13131 1726867217.82213: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d1 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867217.82429: no more pending results, returning what we have 13131 1726867217.82433: results queue empty 13131 1726867217.82434: checking for any_errors_fatal 13131 1726867217.82444: done checking for any_errors_fatal 13131 1726867217.82445: checking for max_fail_percentage 13131 1726867217.82447: done checking for max_fail_percentage 13131 1726867217.82448: checking to see if all hosts have failed and the running result is not ok 13131 1726867217.82449: done checking to see if all hosts have failed 13131 1726867217.82450: getting the remaining hosts for this loop 13131 1726867217.82451: done getting the remaining hosts for this loop 13131 1726867217.82454: getting the next task for host managed_node1 13131 1726867217.82461: done getting next task for host managed_node1 13131 1726867217.82465: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13131 1726867217.82469: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867217.82594: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d1 13131 1726867217.82599: WORKER PROCESS EXITING 13131 1726867217.82618: getting variables 13131 1726867217.82620: in VariableManager get_vars() 13131 1726867217.82671: Calling all_inventory to load vars for managed_node1 13131 1726867217.82674: Calling groups_inventory to load vars for managed_node1 13131 1726867217.82676: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867217.82789: Calling all_plugins_play to load vars for managed_node1 13131 1726867217.82792: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867217.82795: Calling groups_plugins_play to load vars for managed_node1 13131 1726867217.84347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867217.86126: done with get_vars() 13131 1726867217.86144: done getting variables 13131 1726867217.86203: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:20:17 -0400 (0:00:00.057) 0:00:32.972 ****** 13131 1726867217.86231: entering _queue_task() for managed_node1/fail 13131 1726867217.86554: worker is 1 (out of 1 available) 13131 1726867217.86568: exiting _queue_task() for managed_node1/fail 13131 1726867217.86588: done queuing things up, now waiting for results queue to drain 13131 1726867217.86589: waiting for pending results... 
13131 1726867217.86868: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13131 1726867217.87029: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000d2 13131 1726867217.87050: variable 'ansible_search_path' from source: unknown 13131 1726867217.87059: variable 'ansible_search_path' from source: unknown 13131 1726867217.87111: calling self._execute() 13131 1726867217.87227: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867217.87240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867217.87268: variable 'omit' from source: magic vars 13131 1726867217.88139: variable 'ansible_distribution_major_version' from source: facts 13131 1726867217.88155: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867217.88483: variable 'network_state' from source: role '' defaults 13131 1726867217.88486: Evaluated conditional (network_state != {}): False 13131 1726867217.88489: when evaluation is False, skipping this task 13131 1726867217.88491: _execute() done 13131 1726867217.88494: dumping result to json 13131 1726867217.88496: done dumping result, returning 13131 1726867217.88499: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-5f24-9b7a-0000000000d2] 13131 1726867217.88504: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d2 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867217.88619: no more pending results, returning what we have 13131 1726867217.88623: results queue empty 13131 1726867217.88624: checking for any_errors_fatal 13131 1726867217.88631: done checking for any_errors_fatal 
13131 1726867217.88632: checking for max_fail_percentage 13131 1726867217.88634: done checking for max_fail_percentage 13131 1726867217.88635: checking to see if all hosts have failed and the running result is not ok 13131 1726867217.88636: done checking to see if all hosts have failed 13131 1726867217.88637: getting the remaining hosts for this loop 13131 1726867217.88638: done getting the remaining hosts for this loop 13131 1726867217.88641: getting the next task for host managed_node1 13131 1726867217.88649: done getting next task for host managed_node1 13131 1726867217.88654: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13131 1726867217.88658: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867217.88681: getting variables 13131 1726867217.88682: in VariableManager get_vars() 13131 1726867217.88732: Calling all_inventory to load vars for managed_node1 13131 1726867217.88734: Calling groups_inventory to load vars for managed_node1 13131 1726867217.88736: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867217.88747: Calling all_plugins_play to load vars for managed_node1 13131 1726867217.88750: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867217.88753: Calling groups_plugins_play to load vars for managed_node1 13131 1726867217.89387: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d2 13131 1726867217.89392: WORKER PROCESS EXITING 13131 1726867217.90696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867217.93152: done with get_vars() 13131 1726867217.93174: done getting variables 13131 1726867217.93352: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:20:17 -0400 (0:00:00.071) 0:00:33.044 ****** 13131 1726867217.93420: entering _queue_task() for managed_node1/fail 13131 1726867217.94289: worker is 1 (out of 1 available) 13131 1726867217.94299: exiting _queue_task() for managed_node1/fail 13131 1726867217.94308: done queuing things up, now waiting for results queue to drain 13131 1726867217.94309: waiting for pending results... 
13131 1726867217.94671: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13131 1726867217.94996: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000d3 13131 1726867217.95011: variable 'ansible_search_path' from source: unknown 13131 1726867217.95015: variable 'ansible_search_path' from source: unknown 13131 1726867217.95050: calling self._execute() 13131 1726867217.95147: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867217.95154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867217.95162: variable 'omit' from source: magic vars 13131 1726867217.95938: variable 'ansible_distribution_major_version' from source: facts 13131 1726867217.95949: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867217.96323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867218.01386: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867218.01391: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867218.01394: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867218.01397: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867218.01399: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867218.01455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.01487: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.01532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.01552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.01566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.01820: variable 'ansible_distribution_major_version' from source: facts 13131 1726867218.01823: Evaluated conditional (ansible_distribution_major_version | int > 9): True 13131 1726867218.01826: variable 'ansible_distribution' from source: facts 13131 1726867218.01829: variable '__network_rh_distros' from source: role '' defaults 13131 1726867218.01831: Evaluated conditional (ansible_distribution in __network_rh_distros): True 13131 1726867218.02100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.02132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.02158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 
1726867218.02199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.02215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.02265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.02289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.02315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.02357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.02369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.02582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.02586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13131 1726867218.02589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.02591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.02593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.02836: variable 'network_connections' from source: task vars 13131 1726867218.02848: variable 'controller_profile' from source: play vars 13131 1726867218.02917: variable 'controller_profile' from source: play vars 13131 1726867218.02927: variable 'controller_device' from source: play vars 13131 1726867218.03125: variable 'controller_device' from source: play vars 13131 1726867218.03128: variable 'port1_profile' from source: play vars 13131 1726867218.03130: variable 'port1_profile' from source: play vars 13131 1726867218.03133: variable 'dhcp_interface1' from source: play vars 13131 1726867218.03135: variable 'dhcp_interface1' from source: play vars 13131 1726867218.03137: variable 'controller_profile' from source: play vars 13131 1726867218.03190: variable 'controller_profile' from source: play vars 13131 1726867218.03197: variable 'port2_profile' from source: play vars 13131 1726867218.03261: variable 'port2_profile' from source: play vars 13131 1726867218.03267: variable 'dhcp_interface2' from source: play vars 13131 1726867218.03336: variable 'dhcp_interface2' from source: play vars 13131 1726867218.03342: variable 'controller_profile' from source: play vars 13131 1726867218.03400: variable 'controller_profile' from source: play vars 13131 1726867218.03411: 
variable 'network_state' from source: role '' defaults 13131 1726867218.03481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867218.03651: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867218.03689: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867218.03722: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867218.03749: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867218.03796: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867218.03821: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867218.03845: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.03874: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867218.04084: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 13131 1726867218.04087: when evaluation is False, skipping this task 13131 1726867218.04089: _execute() done 13131 1726867218.04091: dumping result to 
json 13131 1726867218.04092: done dumping result, returning 13131 1726867218.04094: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-5f24-9b7a-0000000000d3] 13131 1726867218.04096: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d3 13131 1726867218.04157: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d3 13131 1726867218.04160: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 13131 1726867218.04227: no more pending results, returning what we have 13131 1726867218.04230: results queue empty 13131 1726867218.04231: checking for any_errors_fatal 13131 1726867218.04236: done checking for any_errors_fatal 13131 1726867218.04237: checking for max_fail_percentage 13131 1726867218.04238: done checking for max_fail_percentage 13131 1726867218.04239: checking to see if all hosts have failed and the running result is not ok 13131 1726867218.04239: done checking to see if all hosts have failed 13131 1726867218.04240: getting the remaining hosts for this loop 13131 1726867218.04241: done getting the remaining hosts for this loop 13131 1726867218.04245: getting the next task for host managed_node1 13131 1726867218.04250: done getting next task for host managed_node1 13131 1726867218.04254: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13131 1726867218.04256: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867218.04272: getting variables 13131 1726867218.04274: in VariableManager get_vars() 13131 1726867218.04320: Calling all_inventory to load vars for managed_node1 13131 1726867218.04323: Calling groups_inventory to load vars for managed_node1 13131 1726867218.04325: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867218.04333: Calling all_plugins_play to load vars for managed_node1 13131 1726867218.04335: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867218.04338: Calling groups_plugins_play to load vars for managed_node1 13131 1726867218.07119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867218.10275: done with get_vars() 13131 1726867218.10304: done getting variables 13131 1726867218.10367: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:20:18 -0400 (0:00:00.171) 0:00:33.216 ****** 
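The conditional evaluated just above — `network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | ...` — can be sketched in plain Python for readers unfamiliar with Jinja2 filter chains. This is an illustrative reimplementation with hypothetical data, not the role's actual code; Ansible evaluates the real expression through Jinja2's `selectattr` filter and `match` test.

```python
import re

def has_team_connection(network_connections, network_state):
    """Mirror of the role's conditional: True if any connection or any
    interface in network_state has a 'type' attribute matching ^team$."""
    def any_team(items):
        # selectattr("type", "defined") keeps mappings that have the key;
        # selectattr("type", "match", "^team$") keeps those whose value
        # matches the anchored regex.
        return any(
            re.match(r"^team$", item["type"])
            for item in items
            if "type" in item
        )
    return any_team(network_connections) or any_team(
        network_state.get("interfaces", [])
    )

# Hypothetical inputs shaped like this run: a bond controller plus
# ethernet ports, no team interfaces -- so the task is skipped.
connections = [
    {"name": "bond0", "type": "bond"},
    {"name": "bond0.0", "type": "ethernet"},
]
print(has_team_connection(connections, {}))  # False
```

With no `team`-typed entries on either side of the `or`, the conditional is False and the EL10 teaming-abort task is skipped, which is exactly what the result dump above shows.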
13131 1726867218.10604: entering _queue_task() for managed_node1/dnf 13131 1726867218.11159: worker is 1 (out of 1 available) 13131 1726867218.11172: exiting _queue_task() for managed_node1/dnf 13131 1726867218.11486: done queuing things up, now waiting for results queue to drain 13131 1726867218.11488: waiting for pending results... 13131 1726867218.11867: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13131 1726867218.12013: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000d4 13131 1726867218.12035: variable 'ansible_search_path' from source: unknown 13131 1726867218.12048: variable 'ansible_search_path' from source: unknown 13131 1726867218.12092: calling self._execute() 13131 1726867218.12261: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867218.12265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867218.12268: variable 'omit' from source: magic vars 13131 1726867218.12612: variable 'ansible_distribution_major_version' from source: facts 13131 1726867218.12630: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867218.12833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867218.15076: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867218.15149: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867218.15196: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867218.15269: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867218.15272: Loading FilterModule 'urlsplit' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867218.15347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.15399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.15428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.15471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.15583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.15614: variable 'ansible_distribution' from source: facts 13131 1726867218.15624: variable 'ansible_distribution_major_version' from source: facts 13131 1726867218.15643: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13131 1726867218.15763: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867218.15900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.15933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.15960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.16003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.16025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.16067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.16096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.16123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.16168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.16187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.16228: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.16258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.16288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.16348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.16351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.16504: variable 'network_connections' from source: task vars 13131 1726867218.16522: variable 'controller_profile' from source: play vars 13131 1726867218.16673: variable 'controller_profile' from source: play vars 13131 1726867218.16678: variable 'controller_device' from source: play vars 13131 1726867218.16680: variable 'controller_device' from source: play vars 13131 1726867218.16683: variable 'port1_profile' from source: play vars 13131 1726867218.16738: variable 'port1_profile' from source: play vars 13131 1726867218.16751: variable 'dhcp_interface1' from source: play vars 13131 1726867218.16818: variable 'dhcp_interface1' from source: play vars 13131 1726867218.16830: variable 'controller_profile' from source: play vars 13131 1726867218.16892: variable 'controller_profile' from source: play vars 13131 1726867218.16908: variable 'port2_profile' from source: play vars 13131 
1726867218.16967: variable 'port2_profile' from source: play vars 13131 1726867218.16983: variable 'dhcp_interface2' from source: play vars 13131 1726867218.17045: variable 'dhcp_interface2' from source: play vars 13131 1726867218.17057: variable 'controller_profile' from source: play vars 13131 1726867218.17282: variable 'controller_profile' from source: play vars 13131 1726867218.17294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867218.17362: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867218.17404: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867218.17437: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867218.17468: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867218.17516: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867218.17556: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867218.17590: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.17619: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867218.17688: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867218.18282: variable 
'network_connections' from source: task vars 13131 1726867218.18285: variable 'controller_profile' from source: play vars 13131 1726867218.18287: variable 'controller_profile' from source: play vars 13131 1726867218.18289: variable 'controller_device' from source: play vars 13131 1726867218.18290: variable 'controller_device' from source: play vars 13131 1726867218.18292: variable 'port1_profile' from source: play vars 13131 1726867218.18417: variable 'port1_profile' from source: play vars 13131 1726867218.18624: variable 'dhcp_interface1' from source: play vars 13131 1726867218.18627: variable 'dhcp_interface1' from source: play vars 13131 1726867218.18630: variable 'controller_profile' from source: play vars 13131 1726867218.18712: variable 'controller_profile' from source: play vars 13131 1726867218.18765: variable 'port2_profile' from source: play vars 13131 1726867218.18902: variable 'port2_profile' from source: play vars 13131 1726867218.18914: variable 'dhcp_interface2' from source: play vars 13131 1726867218.19013: variable 'dhcp_interface2' from source: play vars 13131 1726867218.19090: variable 'controller_profile' from source: play vars 13131 1726867218.19339: variable 'controller_profile' from source: play vars 13131 1726867218.19449: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13131 1726867218.19452: when evaluation is False, skipping this task 13131 1726867218.19454: _execute() done 13131 1726867218.19457: dumping result to json 13131 1726867218.19459: done dumping result, returning 13131 1726867218.19461: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-0000000000d4] 13131 1726867218.19464: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d4 13131 1726867218.19542: done sending task result for 
task 0affcac9-a3a5-5f24-9b7a-0000000000d4 13131 1726867218.19545: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13131 1726867218.19606: no more pending results, returning what we have 13131 1726867218.19610: results queue empty 13131 1726867218.19612: checking for any_errors_fatal 13131 1726867218.19617: done checking for any_errors_fatal 13131 1726867218.19618: checking for max_fail_percentage 13131 1726867218.19620: done checking for max_fail_percentage 13131 1726867218.19621: checking to see if all hosts have failed and the running result is not ok 13131 1726867218.19621: done checking to see if all hosts have failed 13131 1726867218.19622: getting the remaining hosts for this loop 13131 1726867218.19624: done getting the remaining hosts for this loop 13131 1726867218.19627: getting the next task for host managed_node1 13131 1726867218.19636: done getting next task for host managed_node1 13131 1726867218.19640: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13131 1726867218.19643: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867218.19666: getting variables 13131 1726867218.19668: in VariableManager get_vars() 13131 1726867218.19728: Calling all_inventory to load vars for managed_node1 13131 1726867218.19731: Calling groups_inventory to load vars for managed_node1 13131 1726867218.19734: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867218.19745: Calling all_plugins_play to load vars for managed_node1 13131 1726867218.19749: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867218.19752: Calling groups_plugins_play to load vars for managed_node1 13131 1726867218.22518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867218.24348: done with get_vars() 13131 1726867218.24368: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13131 1726867218.24447: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:20:18 -0400 (0:00:00.138) 0:00:33.355 ****** 13131 1726867218.24480: entering _queue_task() for managed_node1/yum 13131 1726867218.24890: worker is 1 (out of 1 available) 13131 1726867218.24901: exiting _queue_task() for managed_node1/yum 13131 1726867218.24911: done queuing things up, now waiting for results queue to drain 13131 1726867218.24912: waiting for pending results... 
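The YUM variant of this check is gated on `ansible_distribution_major_version | int < 8`, i.e. it only applies to EL7 and older, where DNF is unavailable. A minimal Python sketch of that gate (an illustration, not Ansible source; note that Jinja2's `int` filter falls back to `0` when the value is not numeric):

```python
def yum_path_applies(distribution_major_version):
    """Gate for the YUM-based package check: runs only when the major
    version is below 8.  Mimics Jinja2's `| int` filter, which returns
    its default (0) rather than raising on non-numeric input."""
    try:
        major = int(distribution_major_version)
    except (TypeError, ValueError):
        major = 0  # default of Jinja2's | int filter
    return major < 8

print(yum_path_applies("7"))   # True  -> YUM task would run
print(yum_path_applies("10"))  # False -> skipped, as in this log
```

On this managed host the major version is well above 7, so the condition evaluates False and the task is skipped with `false_condition: ansible_distribution_major_version | int < 8`, as the result below records. The `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` line above shows why the same `dnf.py` action plugin is loaded either way on modern systems.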
13131 1726867218.25088: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13131 1726867218.25321: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000d5 13131 1726867218.25341: variable 'ansible_search_path' from source: unknown 13131 1726867218.25349: variable 'ansible_search_path' from source: unknown 13131 1726867218.25396: calling self._execute() 13131 1726867218.25792: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867218.25796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867218.25798: variable 'omit' from source: magic vars 13131 1726867218.26409: variable 'ansible_distribution_major_version' from source: facts 13131 1726867218.26431: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867218.26788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867218.29436: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867218.29571: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867218.29680: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867218.29769: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867218.29827: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867218.30190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.30194: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.30196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.30198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.30206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.30309: variable 'ansible_distribution_major_version' from source: facts 13131 1726867218.30330: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13131 1726867218.30338: when evaluation is False, skipping this task 13131 1726867218.30345: _execute() done 13131 1726867218.30352: dumping result to json 13131 1726867218.30358: done dumping result, returning 13131 1726867218.30370: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-0000000000d5] 13131 1726867218.30379: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d5 13131 1726867218.30495: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d5 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13131 1726867218.30553: no more pending results, returning what we have 13131 1726867218.30556: results 
queue empty 13131 1726867218.30557: checking for any_errors_fatal 13131 1726867218.30564: done checking for any_errors_fatal 13131 1726867218.30564: checking for max_fail_percentage 13131 1726867218.30566: done checking for max_fail_percentage 13131 1726867218.30567: checking to see if all hosts have failed and the running result is not ok 13131 1726867218.30568: done checking to see if all hosts have failed 13131 1726867218.30569: getting the remaining hosts for this loop 13131 1726867218.30570: done getting the remaining hosts for this loop 13131 1726867218.30574: getting the next task for host managed_node1 13131 1726867218.30583: done getting next task for host managed_node1 13131 1726867218.30587: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13131 1726867218.30590: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867218.30611: getting variables 13131 1726867218.30613: in VariableManager get_vars() 13131 1726867218.30667: Calling all_inventory to load vars for managed_node1 13131 1726867218.30670: Calling groups_inventory to load vars for managed_node1 13131 1726867218.30673: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867218.30890: Calling all_plugins_play to load vars for managed_node1 13131 1726867218.30894: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867218.30898: Calling groups_plugins_play to load vars for managed_node1 13131 1726867218.31591: WORKER PROCESS EXITING 13131 1726867218.32388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867218.34210: done with get_vars() 13131 1726867218.34234: done getting variables 13131 1726867218.34595: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:20:18 -0400 (0:00:00.101) 0:00:33.456 ****** 13131 1726867218.34628: entering _queue_task() for managed_node1/fail 13131 1726867218.35096: worker is 1 (out of 1 available) 13131 1726867218.35109: exiting _queue_task() for managed_node1/fail 13131 1726867218.35385: done queuing things up, now waiting for results queue to drain 13131 1726867218.35387: waiting for pending results... 
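This `fail` task is guarded by `__network_wireless_connections_defined or __network_team_connections_defined`; when both are False the executor emits a skipped result rather than aborting the play. A simplified sketch of that decision and the shape of the skip result it produces (hypothetical helper; the real TaskExecutor does far more):

```python
def maybe_skip(wireless_defined, team_defined):
    """Sketch of the skip decision for the consent-to-restart task:
    the fail action only runs when wireless or team connections are
    defined; otherwise a 'skipped' result like the one logged below
    is returned for the host."""
    if not (wireless_defined or team_defined):
        return {
            "changed": False,
            "skipped": True,
            "false_condition": "__network_wireless_connections_defined "
                               "or __network_team_connections_defined",
            "skip_reason": "Conditional result was False",
        }
    return None  # the fail action would run and abort the play

result = maybe_skip(False, False)
print(result["skip_reason"])  # Conditional result was False
```

In this run neither wireless nor team connections are defined (the profiles are a bond controller and two ethernet ports), so the skip path is taken, matching the `skipping: [managed_node1]` result that follows.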
13131 1726867218.35656: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13131 1726867218.35998: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000d6 13131 1726867218.36095: variable 'ansible_search_path' from source: unknown 13131 1726867218.36099: variable 'ansible_search_path' from source: unknown 13131 1726867218.36102: calling self._execute() 13131 1726867218.36258: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867218.36271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867218.36289: variable 'omit' from source: magic vars 13131 1726867218.36648: variable 'ansible_distribution_major_version' from source: facts 13131 1726867218.36659: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867218.36738: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867218.36868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867218.38860: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867218.38864: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867218.38892: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867218.38935: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867218.38969: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867218.39055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13131 1726867218.39495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.39560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.39625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.39650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.39688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.39704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.39722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.39759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.39770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.39800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.39818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.39916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.39919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.39922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.40186: variable 'network_connections' from source: task vars 13131 1726867218.40190: variable 'controller_profile' from source: play vars 13131 1726867218.40192: variable 'controller_profile' from source: play vars 13131 1726867218.40194: variable 'controller_device' from source: play vars 13131 1726867218.40243: variable 'controller_device' from source: play vars 13131 1726867218.40258: variable 'port1_profile' from source: play vars 13131 1726867218.40327: variable 'port1_profile' from source: play vars 13131 1726867218.40342: variable 'dhcp_interface1' from source: play vars 13131 1726867218.40415: variable 'dhcp_interface1' from source: play vars 13131 1726867218.40430: variable 'controller_profile' from source: play vars 13131 
1726867218.40497: variable 'controller_profile' from source: play vars 13131 1726867218.40518: variable 'port2_profile' from source: play vars 13131 1726867218.40583: variable 'port2_profile' from source: play vars 13131 1726867218.40597: variable 'dhcp_interface2' from source: play vars 13131 1726867218.40669: variable 'dhcp_interface2' from source: play vars 13131 1726867218.40686: variable 'controller_profile' from source: play vars 13131 1726867218.40756: variable 'controller_profile' from source: play vars 13131 1726867218.40840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867218.41023: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867218.41072: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867218.41110: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867218.41144: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867218.41203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867218.41232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867218.41275: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.41386: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 13131 1726867218.41399: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867218.41633: variable 'network_connections' from source: task vars 13131 1726867218.41643: variable 'controller_profile' from source: play vars 13131 1726867218.41716: variable 'controller_profile' from source: play vars 13131 1726867218.41729: variable 'controller_device' from source: play vars 13131 1726867218.41832: variable 'controller_device' from source: play vars 13131 1726867218.41883: variable 'port1_profile' from source: play vars 13131 1726867218.42283: variable 'port1_profile' from source: play vars 13131 1726867218.42286: variable 'dhcp_interface1' from source: play vars 13131 1726867218.42289: variable 'dhcp_interface1' from source: play vars 13131 1726867218.42291: variable 'controller_profile' from source: play vars 13131 1726867218.42385: variable 'controller_profile' from source: play vars 13131 1726867218.42492: variable 'port2_profile' from source: play vars 13131 1726867218.42563: variable 'port2_profile' from source: play vars 13131 1726867218.42637: variable 'dhcp_interface2' from source: play vars 13131 1726867218.42754: variable 'dhcp_interface2' from source: play vars 13131 1726867218.42768: variable 'controller_profile' from source: play vars 13131 1726867218.42914: variable 'controller_profile' from source: play vars 13131 1726867218.43033: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13131 1726867218.43041: when evaluation is False, skipping this task 13131 1726867218.43047: _execute() done 13131 1726867218.43056: dumping result to json 13131 1726867218.43072: done dumping result, returning 13131 1726867218.43088: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-0000000000d6] 13131 1726867218.43098: sending 
task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d6 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13131 1726867218.43374: no more pending results, returning what we have 13131 1726867218.43385: results queue empty 13131 1726867218.43392: checking for any_errors_fatal 13131 1726867218.43398: done checking for any_errors_fatal 13131 1726867218.43398: checking for max_fail_percentage 13131 1726867218.43401: done checking for max_fail_percentage 13131 1726867218.43402: checking to see if all hosts have failed and the running result is not ok 13131 1726867218.43403: done checking to see if all hosts have failed 13131 1726867218.43404: getting the remaining hosts for this loop 13131 1726867218.43405: done getting the remaining hosts for this loop 13131 1726867218.43409: getting the next task for host managed_node1 13131 1726867218.43416: done getting next task for host managed_node1 13131 1726867218.43420: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13131 1726867218.43423: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867218.43444: getting variables 13131 1726867218.43446: in VariableManager get_vars() 13131 1726867218.43595: Calling all_inventory to load vars for managed_node1 13131 1726867218.43599: Calling groups_inventory to load vars for managed_node1 13131 1726867218.43601: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867218.43611: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d6 13131 1726867218.43614: WORKER PROCESS EXITING 13131 1726867218.43623: Calling all_plugins_play to load vars for managed_node1 13131 1726867218.43626: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867218.43629: Calling groups_plugins_play to load vars for managed_node1 13131 1726867218.45406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867218.46481: done with get_vars() 13131 1726867218.46501: done getting variables 13131 1726867218.46557: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:20:18 -0400 (0:00:00.119) 0:00:33.576 ****** 13131 1726867218.46597: entering _queue_task() for managed_node1/package 13131 1726867218.46866: worker is 1 (out of 1 available) 13131 1726867218.46881: exiting _queue_task() for managed_node1/package 13131 1726867218.46891: done queuing things up, now waiting for results queue to drain 13131 1726867218.46893: waiting for pending results... 
13131 1726867218.47096: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 13131 1726867218.47217: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000d7 13131 1726867218.47237: variable 'ansible_search_path' from source: unknown 13131 1726867218.47240: variable 'ansible_search_path' from source: unknown 13131 1726867218.47272: calling self._execute() 13131 1726867218.47373: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867218.47382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867218.47387: variable 'omit' from source: magic vars 13131 1726867218.47762: variable 'ansible_distribution_major_version' from source: facts 13131 1726867218.47773: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867218.48032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867218.48209: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867218.48255: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867218.48271: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867218.48324: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867218.48403: variable 'network_packages' from source: role '' defaults 13131 1726867218.48478: variable '__network_provider_setup' from source: role '' defaults 13131 1726867218.48487: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867218.48533: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867218.48541: variable '__network_packages_default_nm' from source: role '' defaults 13131 1726867218.48588: variable 
'__network_packages_default_nm' from source: role '' defaults 13131 1726867218.48702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867218.50072: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867218.50282: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867218.50286: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867218.50288: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867218.50291: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867218.50309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.50334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.50359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.50401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.50512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 
1726867218.50518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.50521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.50523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.50540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.50557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.50787: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13131 1726867218.50900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.50938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.50955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.50992: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.50998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.51058: variable 'ansible_python' from source: facts 13131 1726867218.51080: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13131 1726867218.51135: variable '__network_wpa_supplicant_required' from source: role '' defaults 13131 1726867218.51194: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13131 1726867218.51285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.51308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.51325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.51349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.51359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.51392: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.51417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.51433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.51457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.51467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.51565: variable 'network_connections' from source: task vars 13131 1726867218.51571: variable 'controller_profile' from source: play vars 13131 1726867218.51644: variable 'controller_profile' from source: play vars 13131 1726867218.51653: variable 'controller_device' from source: play vars 13131 1726867218.51724: variable 'controller_device' from source: play vars 13131 1726867218.51737: variable 'port1_profile' from source: play vars 13131 1726867218.51801: variable 'port1_profile' from source: play vars 13131 1726867218.51811: variable 'dhcp_interface1' from source: play vars 13131 1726867218.51882: variable 'dhcp_interface1' from source: play vars 13131 1726867218.51889: variable 'controller_profile' from source: play vars 13131 1726867218.51959: variable 'controller_profile' from source: play vars 13131 1726867218.51966: variable 'port2_profile' from source: play vars 13131 
1726867218.52035: variable 'port2_profile' from source: play vars 13131 1726867218.52043: variable 'dhcp_interface2' from source: play vars 13131 1726867218.52116: variable 'dhcp_interface2' from source: play vars 13131 1726867218.52123: variable 'controller_profile' from source: play vars 13131 1726867218.52192: variable 'controller_profile' from source: play vars 13131 1726867218.52242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867218.52261: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867218.52285: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.52309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867218.52346: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867218.52526: variable 'network_connections' from source: task vars 13131 1726867218.52530: variable 'controller_profile' from source: play vars 13131 1726867218.52599: variable 'controller_profile' from source: play vars 13131 1726867218.52611: variable 'controller_device' from source: play vars 13131 1726867218.52674: variable 'controller_device' from source: play vars 13131 1726867218.52685: variable 'port1_profile' from source: play vars 13131 1726867218.52755: variable 'port1_profile' from source: play vars 13131 1726867218.52763: variable 'dhcp_interface1' from source: play vars 13131 1726867218.52833: variable 'dhcp_interface1' from source: 
play vars 13131 1726867218.52841: variable 'controller_profile' from source: play vars 13131 1726867218.52910: variable 'controller_profile' from source: play vars 13131 1726867218.52918: variable 'port2_profile' from source: play vars 13131 1726867218.52987: variable 'port2_profile' from source: play vars 13131 1726867218.52995: variable 'dhcp_interface2' from source: play vars 13131 1726867218.53066: variable 'dhcp_interface2' from source: play vars 13131 1726867218.53073: variable 'controller_profile' from source: play vars 13131 1726867218.53142: variable 'controller_profile' from source: play vars 13131 1726867218.53181: variable '__network_packages_default_wireless' from source: role '' defaults 13131 1726867218.53234: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867218.53432: variable 'network_connections' from source: task vars 13131 1726867218.53436: variable 'controller_profile' from source: play vars 13131 1726867218.53484: variable 'controller_profile' from source: play vars 13131 1726867218.53488: variable 'controller_device' from source: play vars 13131 1726867218.53534: variable 'controller_device' from source: play vars 13131 1726867218.53541: variable 'port1_profile' from source: play vars 13131 1726867218.53592: variable 'port1_profile' from source: play vars 13131 1726867218.53595: variable 'dhcp_interface1' from source: play vars 13131 1726867218.53640: variable 'dhcp_interface1' from source: play vars 13131 1726867218.53645: variable 'controller_profile' from source: play vars 13131 1726867218.53694: variable 'controller_profile' from source: play vars 13131 1726867218.53697: variable 'port2_profile' from source: play vars 13131 1726867218.53743: variable 'port2_profile' from source: play vars 13131 1726867218.53749: variable 'dhcp_interface2' from source: play vars 13131 1726867218.53796: variable 'dhcp_interface2' from source: play vars 13131 1726867218.53799: variable 'controller_profile' from 
source: play vars 13131 1726867218.53847: variable 'controller_profile' from source: play vars 13131 1726867218.53866: variable '__network_packages_default_team' from source: role '' defaults 13131 1726867218.53925: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867218.54119: variable 'network_connections' from source: task vars 13131 1726867218.54123: variable 'controller_profile' from source: play vars 13131 1726867218.54170: variable 'controller_profile' from source: play vars 13131 1726867218.54176: variable 'controller_device' from source: play vars 13131 1726867218.54224: variable 'controller_device' from source: play vars 13131 1726867218.54231: variable 'port1_profile' from source: play vars 13131 1726867218.54279: variable 'port1_profile' from source: play vars 13131 1726867218.54285: variable 'dhcp_interface1' from source: play vars 13131 1726867218.54331: variable 'dhcp_interface1' from source: play vars 13131 1726867218.54336: variable 'controller_profile' from source: play vars 13131 1726867218.54385: variable 'controller_profile' from source: play vars 13131 1726867218.54391: variable 'port2_profile' from source: play vars 13131 1726867218.54438: variable 'port2_profile' from source: play vars 13131 1726867218.54443: variable 'dhcp_interface2' from source: play vars 13131 1726867218.54493: variable 'dhcp_interface2' from source: play vars 13131 1726867218.54498: variable 'controller_profile' from source: play vars 13131 1726867218.54545: variable 'controller_profile' from source: play vars 13131 1726867218.54592: variable '__network_service_name_default_initscripts' from source: role '' defaults 13131 1726867218.54636: variable '__network_service_name_default_initscripts' from source: role '' defaults 13131 1726867218.54641: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 1726867218.54686: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 
1726867218.54821: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13131 1726867218.55166: variable 'network_connections' from source: task vars 13131 1726867218.55169: variable 'controller_profile' from source: play vars 13131 1726867218.55219: variable 'controller_profile' from source: play vars 13131 1726867218.55227: variable 'controller_device' from source: play vars 13131 1726867218.55269: variable 'controller_device' from source: play vars 13131 1726867218.55276: variable 'port1_profile' from source: play vars 13131 1726867218.55322: variable 'port1_profile' from source: play vars 13131 1726867218.55325: variable 'dhcp_interface1' from source: play vars 13131 1726867218.55368: variable 'dhcp_interface1' from source: play vars 13131 1726867218.55374: variable 'controller_profile' from source: play vars 13131 1726867218.55418: variable 'controller_profile' from source: play vars 13131 1726867218.55424: variable 'port2_profile' from source: play vars 13131 1726867218.55467: variable 'port2_profile' from source: play vars 13131 1726867218.55473: variable 'dhcp_interface2' from source: play vars 13131 1726867218.55517: variable 'dhcp_interface2' from source: play vars 13131 1726867218.55523: variable 'controller_profile' from source: play vars 13131 1726867218.55568: variable 'controller_profile' from source: play vars 13131 1726867218.55574: variable 'ansible_distribution' from source: facts 13131 1726867218.55578: variable '__network_rh_distros' from source: role '' defaults 13131 1726867218.55585: variable 'ansible_distribution_major_version' from source: facts 13131 1726867218.55603: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13131 1726867218.55714: variable 'ansible_distribution' from source: facts 13131 1726867218.55717: variable '__network_rh_distros' from source: role '' defaults 13131 1726867218.55722: variable 'ansible_distribution_major_version' from source: 
facts 13131 1726867218.55732: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13131 1726867218.55839: variable 'ansible_distribution' from source: facts 13131 1726867218.55842: variable '__network_rh_distros' from source: role '' defaults 13131 1726867218.55846: variable 'ansible_distribution_major_version' from source: facts 13131 1726867218.55872: variable 'network_provider' from source: set_fact 13131 1726867218.55886: variable 'ansible_facts' from source: unknown 13131 1726867218.56334: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13131 1726867218.56524: when evaluation is False, skipping this task 13131 1726867218.56528: _execute() done 13131 1726867218.56531: dumping result to json 13131 1726867218.56533: done dumping result, returning 13131 1726867218.56538: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-5f24-9b7a-0000000000d7] 13131 1726867218.56541: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d7 13131 1726867218.56609: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d7 13131 1726867218.56611: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13131 1726867218.56660: no more pending results, returning what we have 13131 1726867218.56663: results queue empty 13131 1726867218.56664: checking for any_errors_fatal 13131 1726867218.56670: done checking for any_errors_fatal 13131 1726867218.56671: checking for max_fail_percentage 13131 1726867218.56672: done checking for max_fail_percentage 13131 1726867218.56673: checking to see if all hosts have failed and the running result is not ok 13131 1726867218.56673: done checking to see if all hosts have failed 13131 1726867218.56674: getting the remaining hosts for 
this loop 13131 1726867218.56675: done getting the remaining hosts for this loop 13131 1726867218.56728: getting the next task for host managed_node1 13131 1726867218.56734: done getting next task for host managed_node1 13131 1726867218.56738: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13131 1726867218.56741: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867218.56758: getting variables 13131 1726867218.56759: in VariableManager get_vars() 13131 1726867218.56808: Calling all_inventory to load vars for managed_node1 13131 1726867218.56811: Calling groups_inventory to load vars for managed_node1 13131 1726867218.56813: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867218.56822: Calling all_plugins_play to load vars for managed_node1 13131 1726867218.56825: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867218.56849: Calling groups_plugins_play to load vars for managed_node1 13131 1726867218.57917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867218.58791: done with get_vars() 13131 1726867218.58808: done getting variables 13131 1726867218.58849: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:20:18 -0400 (0:00:00.122) 0:00:33.699 ****** 13131 1726867218.58873: entering _queue_task() for managed_node1/package 13131 1726867218.59087: worker is 1 (out of 1 available) 13131 1726867218.59099: exiting _queue_task() for managed_node1/package 13131 1726867218.59112: done queuing things up, now waiting for results queue to drain 13131 1726867218.59114: waiting for pending results... 13131 1726867218.59492: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13131 1726867218.59497: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000d8 13131 1726867218.59500: variable 'ansible_search_path' from source: unknown 13131 1726867218.59504: variable 'ansible_search_path' from source: unknown 13131 1726867218.59507: calling self._execute() 13131 1726867218.59596: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867218.59607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867218.59618: variable 'omit' from source: magic vars 13131 1726867218.59966: variable 'ansible_distribution_major_version' from source: facts 13131 1726867218.59986: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867218.60101: variable 'network_state' from source: role '' defaults 13131 1726867218.60115: Evaluated conditional (network_state != {}): False 13131 1726867218.60121: when evaluation is False, skipping this task 13131 1726867218.60127: _execute() done 13131 
1726867218.60132: dumping result to json 13131 1726867218.60137: done dumping result, returning 13131 1726867218.60147: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-5f24-9b7a-0000000000d8] 13131 1726867218.60155: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d8 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867218.60303: no more pending results, returning what we have 13131 1726867218.60309: results queue empty 13131 1726867218.60310: checking for any_errors_fatal 13131 1726867218.60314: done checking for any_errors_fatal 13131 1726867218.60315: checking for max_fail_percentage 13131 1726867218.60317: done checking for max_fail_percentage 13131 1726867218.60318: checking to see if all hosts have failed and the running result is not ok 13131 1726867218.60319: done checking to see if all hosts have failed 13131 1726867218.60320: getting the remaining hosts for this loop 13131 1726867218.60321: done getting the remaining hosts for this loop 13131 1726867218.60324: getting the next task for host managed_node1 13131 1726867218.60331: done getting next task for host managed_node1 13131 1726867218.60334: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13131 1726867218.60337: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13131 1726867218.60359: getting variables 13131 1726867218.60360: in VariableManager get_vars() 13131 1726867218.60413: Calling all_inventory to load vars for managed_node1 13131 1726867218.60416: Calling groups_inventory to load vars for managed_node1 13131 1726867218.60419: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867218.60427: Calling all_plugins_play to load vars for managed_node1 13131 1726867218.60429: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867218.60432: Calling groups_plugins_play to load vars for managed_node1 13131 1726867218.61028: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d8 13131 1726867218.61032: WORKER PROCESS EXITING 13131 1726867218.61949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867218.63620: done with get_vars() 13131 1726867218.63640: done getting variables 13131 1726867218.63696: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:20:18 -0400 (0:00:00.048) 0:00:33.747 ****** 13131 1726867218.63738: entering _queue_task() for managed_node1/package 13131 1726867218.64037: worker is 1 (out of 1 available) 13131 1726867218.64163: exiting _queue_task() for managed_node1/package 13131 1726867218.64173: done queuing things up, now waiting for results queue to drain 13131 1726867218.64174: waiting for pending results... 
13131 1726867218.64350: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13131 1726867218.64507: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000d9 13131 1726867218.64530: variable 'ansible_search_path' from source: unknown 13131 1726867218.64538: variable 'ansible_search_path' from source: unknown 13131 1726867218.64576: calling self._execute() 13131 1726867218.64683: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867218.64695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867218.64715: variable 'omit' from source: magic vars 13131 1726867218.65104: variable 'ansible_distribution_major_version' from source: facts 13131 1726867218.65121: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867218.65257: variable 'network_state' from source: role '' defaults 13131 1726867218.65278: Evaluated conditional (network_state != {}): False 13131 1726867218.65286: when evaluation is False, skipping this task 13131 1726867218.65293: _execute() done 13131 1726867218.65299: dumping result to json 13131 1726867218.65309: done dumping result, returning 13131 1726867218.65320: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-5f24-9b7a-0000000000d9] 13131 1726867218.65329: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d9 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867218.65489: no more pending results, returning what we have 13131 1726867218.65492: results queue empty 13131 1726867218.65493: checking for any_errors_fatal 13131 1726867218.65505: done checking for any_errors_fatal 13131 1726867218.65505: checking for max_fail_percentage 13131 
1726867218.65508: done checking for max_fail_percentage 13131 1726867218.65508: checking to see if all hosts have failed and the running result is not ok 13131 1726867218.65509: done checking to see if all hosts have failed 13131 1726867218.65510: getting the remaining hosts for this loop 13131 1726867218.65511: done getting the remaining hosts for this loop 13131 1726867218.65515: getting the next task for host managed_node1 13131 1726867218.65522: done getting next task for host managed_node1 13131 1726867218.65525: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13131 1726867218.65529: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867218.65551: getting variables 13131 1726867218.65552: in VariableManager get_vars() 13131 1726867218.65611: Calling all_inventory to load vars for managed_node1 13131 1726867218.65614: Calling groups_inventory to load vars for managed_node1 13131 1726867218.65616: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867218.65627: Calling all_plugins_play to load vars for managed_node1 13131 1726867218.65630: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867218.65633: Calling groups_plugins_play to load vars for managed_node1 13131 1726867218.66293: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000d9 13131 1726867218.66296: WORKER PROCESS EXITING 13131 1726867218.67248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867218.69066: done with get_vars() 13131 1726867218.69088: done getting variables 13131 1726867218.69150: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:20:18 -0400 (0:00:00.054) 0:00:33.802 ****** 13131 1726867218.69185: entering _queue_task() for managed_node1/service 13131 1726867218.69574: worker is 1 (out of 1 available) 13131 1726867218.69585: exiting _queue_task() for managed_node1/service 13131 1726867218.69595: done queuing things up, now waiting for results queue to drain 13131 1726867218.69596: waiting for pending results... 
13131 1726867218.69770: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13131 1726867218.69931: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000da 13131 1726867218.69951: variable 'ansible_search_path' from source: unknown 13131 1726867218.69960: variable 'ansible_search_path' from source: unknown 13131 1726867218.70000: calling self._execute() 13131 1726867218.70110: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867218.70127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867218.70141: variable 'omit' from source: magic vars 13131 1726867218.70524: variable 'ansible_distribution_major_version' from source: facts 13131 1726867218.70553: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867218.70669: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867218.70985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867218.73162: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867218.73238: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867218.73290: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867218.73330: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867218.73358: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867218.73446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13131 1726867218.73503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.73536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.73582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.73612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.73710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.73715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.73818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.73822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.73825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.73832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.73860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.73889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.73940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.73958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.74140: variable 'network_connections' from source: task vars 13131 1726867218.74162: variable 'controller_profile' from source: play vars 13131 1726867218.74230: variable 'controller_profile' from source: play vars 13131 1726867218.74255: variable 'controller_device' from source: play vars 13131 1726867218.74365: variable 'controller_device' from source: play vars 13131 1726867218.74373: variable 'port1_profile' from source: play vars 13131 1726867218.74404: variable 'port1_profile' from source: play vars 13131 1726867218.74417: variable 'dhcp_interface1' from source: play vars 13131 1726867218.74488: variable 'dhcp_interface1' from source: play vars 13131 1726867218.74500: variable 'controller_profile' from source: play vars 13131 
1726867218.74563: variable 'controller_profile' from source: play vars 13131 1726867218.74574: variable 'port2_profile' from source: play vars 13131 1726867218.74649: variable 'port2_profile' from source: play vars 13131 1726867218.74682: variable 'dhcp_interface2' from source: play vars 13131 1726867218.74734: variable 'dhcp_interface2' from source: play vars 13131 1726867218.74745: variable 'controller_profile' from source: play vars 13131 1726867218.74816: variable 'controller_profile' from source: play vars 13131 1726867218.74920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867218.75076: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867218.75121: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867218.75163: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867218.75247: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867218.75254: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867218.75283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867218.75317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.75345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 13131 1726867218.75421: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867218.75673: variable 'network_connections' from source: task vars 13131 1726867218.75884: variable 'controller_profile' from source: play vars 13131 1726867218.75887: variable 'controller_profile' from source: play vars 13131 1726867218.75889: variable 'controller_device' from source: play vars 13131 1726867218.75891: variable 'controller_device' from source: play vars 13131 1726867218.75893: variable 'port1_profile' from source: play vars 13131 1726867218.75908: variable 'port1_profile' from source: play vars 13131 1726867218.75921: variable 'dhcp_interface1' from source: play vars 13131 1726867218.75982: variable 'dhcp_interface1' from source: play vars 13131 1726867218.75994: variable 'controller_profile' from source: play vars 13131 1726867218.76066: variable 'controller_profile' from source: play vars 13131 1726867218.76081: variable 'port2_profile' from source: play vars 13131 1726867218.76150: variable 'port2_profile' from source: play vars 13131 1726867218.76162: variable 'dhcp_interface2' from source: play vars 13131 1726867218.76235: variable 'dhcp_interface2' from source: play vars 13131 1726867218.76247: variable 'controller_profile' from source: play vars 13131 1726867218.76312: variable 'controller_profile' from source: play vars 13131 1726867218.76357: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13131 1726867218.76366: when evaluation is False, skipping this task 13131 1726867218.76373: _execute() done 13131 1726867218.76382: dumping result to json 13131 1726867218.76389: done dumping result, returning 13131 1726867218.76404: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-0000000000da] 13131 1726867218.76413: sending task result for task 
0affcac9-a3a5-5f24-9b7a-0000000000da skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13131 1726867218.76607: no more pending results, returning what we have 13131 1726867218.76610: results queue empty 13131 1726867218.76612: checking for any_errors_fatal 13131 1726867218.76618: done checking for any_errors_fatal 13131 1726867218.76619: checking for max_fail_percentage 13131 1726867218.76621: done checking for max_fail_percentage 13131 1726867218.76622: checking to see if all hosts have failed and the running result is not ok 13131 1726867218.76623: done checking to see if all hosts have failed 13131 1726867218.76624: getting the remaining hosts for this loop 13131 1726867218.76625: done getting the remaining hosts for this loop 13131 1726867218.76629: getting the next task for host managed_node1 13131 1726867218.76636: done getting next task for host managed_node1 13131 1726867218.76641: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13131 1726867218.76644: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867218.76670: getting variables 13131 1726867218.76672: in VariableManager get_vars() 13131 1726867218.76735: Calling all_inventory to load vars for managed_node1 13131 1726867218.76738: Calling groups_inventory to load vars for managed_node1 13131 1726867218.76741: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867218.76752: Calling all_plugins_play to load vars for managed_node1 13131 1726867218.76756: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867218.76759: Calling groups_plugins_play to load vars for managed_node1 13131 1726867218.77393: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000da 13131 1726867218.77397: WORKER PROCESS EXITING 13131 1726867218.78431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867218.79298: done with get_vars() 13131 1726867218.79315: done getting variables 13131 1726867218.79356: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:20:18 -0400 (0:00:00.101) 0:00:33.904 ****** 13131 1726867218.79382: entering _queue_task() for managed_node1/service 13131 1726867218.79621: worker is 1 (out of 1 available) 13131 1726867218.79637: exiting _queue_task() for managed_node1/service 13131 1726867218.79649: done queuing things up, now waiting for results queue to drain 13131 1726867218.79650: waiting for pending results... 
13131 1726867218.79827: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13131 1726867218.79929: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000db 13131 1726867218.79942: variable 'ansible_search_path' from source: unknown 13131 1726867218.79945: variable 'ansible_search_path' from source: unknown 13131 1726867218.79982: calling self._execute() 13131 1726867218.80182: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867218.80186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867218.80188: variable 'omit' from source: magic vars 13131 1726867218.80475: variable 'ansible_distribution_major_version' from source: facts 13131 1726867218.80494: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867218.80658: variable 'network_provider' from source: set_fact 13131 1726867218.80668: variable 'network_state' from source: role '' defaults 13131 1726867218.80683: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13131 1726867218.80694: variable 'omit' from source: magic vars 13131 1726867218.80756: variable 'omit' from source: magic vars 13131 1726867218.80825: variable 'network_service_name' from source: role '' defaults 13131 1726867218.80863: variable 'network_service_name' from source: role '' defaults 13131 1726867218.80980: variable '__network_provider_setup' from source: role '' defaults 13131 1726867218.80992: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867218.81060: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867218.81075: variable '__network_packages_default_nm' from source: role '' defaults 13131 1726867218.81151: variable '__network_packages_default_nm' from source: role '' defaults 13131 1726867218.81318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13131 1726867218.82746: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867218.82791: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867218.82819: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867218.82854: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867218.82873: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867218.82934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.82954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.82971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.83006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.83015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.83047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13131 1726867218.83084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.83097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.83125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.83136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.83499: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13131 1726867218.83505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.83507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.83510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.83531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.83543: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.83624: variable 'ansible_python' from source: facts 13131 1726867218.83648: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13131 1726867218.83726: variable '__network_wpa_supplicant_required' from source: role '' defaults 13131 1726867218.83800: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13131 1726867218.83917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.83942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.83965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.84006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.84018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.84061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867218.84087: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867218.84111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.84150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867218.84171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867218.84293: variable 'network_connections' from source: task vars 13131 1726867218.84300: variable 'controller_profile' from source: play vars 13131 1726867218.84368: variable 'controller_profile' from source: play vars 13131 1726867218.84389: variable 'controller_device' from source: play vars 13131 1726867218.84483: variable 'controller_device' from source: play vars 13131 1726867218.84491: variable 'port1_profile' from source: play vars 13131 1726867218.84527: variable 'port1_profile' from source: play vars 13131 1726867218.84606: variable 'dhcp_interface1' from source: play vars 13131 1726867218.84610: variable 'dhcp_interface1' from source: play vars 13131 1726867218.84617: variable 'controller_profile' from source: play vars 13131 1726867218.84687: variable 'controller_profile' from source: play vars 13131 1726867218.84706: variable 'port2_profile' from source: play vars 13131 1726867218.84783: variable 'port2_profile' from source: play vars 13131 1726867218.84786: variable 'dhcp_interface2' from source: play vars 13131 1726867218.84844: variable 'dhcp_interface2' from source: play vars 13131 
1726867218.84855: variable 'controller_profile' from source: play vars 13131 1726867218.84940: variable 'controller_profile' from source: play vars 13131 1726867218.85028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867218.85159: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867218.85194: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867218.85467: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867218.85503: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867218.85543: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867218.85563: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867218.85589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867218.85615: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867218.85652: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867218.85828: variable 'network_connections' from source: task vars 13131 1726867218.85834: variable 'controller_profile' from source: play vars 13131 1726867218.85886: variable 'controller_profile' from source: play vars 13131 
1726867218.85894: variable 'controller_device' from source: play vars 13131 1726867218.85947: variable 'controller_device' from source: play vars 13131 1726867218.85957: variable 'port1_profile' from source: play vars 13131 1726867218.86008: variable 'port1_profile' from source: play vars 13131 1726867218.86018: variable 'dhcp_interface1' from source: play vars 13131 1726867218.86069: variable 'dhcp_interface1' from source: play vars 13131 1726867218.86079: variable 'controller_profile' from source: play vars 13131 1726867218.86129: variable 'controller_profile' from source: play vars 13131 1726867218.86138: variable 'port2_profile' from source: play vars 13131 1726867218.86190: variable 'port2_profile' from source: play vars 13131 1726867218.86198: variable 'dhcp_interface2' from source: play vars 13131 1726867218.86251: variable 'dhcp_interface2' from source: play vars 13131 1726867218.86260: variable 'controller_profile' from source: play vars 13131 1726867218.86311: variable 'controller_profile' from source: play vars 13131 1726867218.86345: variable '__network_packages_default_wireless' from source: role '' defaults 13131 1726867218.86399: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867218.86608: variable 'network_connections' from source: task vars 13131 1726867218.86611: variable 'controller_profile' from source: play vars 13131 1726867218.86705: variable 'controller_profile' from source: play vars 13131 1726867218.86708: variable 'controller_device' from source: play vars 13131 1726867218.86766: variable 'controller_device' from source: play vars 13131 1726867218.86769: variable 'port1_profile' from source: play vars 13131 1726867218.86985: variable 'port1_profile' from source: play vars 13131 1726867218.86990: variable 'dhcp_interface1' from source: play vars 13131 1726867218.86993: variable 'dhcp_interface1' from source: play vars 13131 1726867218.86995: variable 'controller_profile' from source: play vars 
13131 1726867218.86998: variable 'controller_profile' from source: play vars 13131 1726867218.87000: variable 'port2_profile' from source: play vars 13131 1726867218.87055: variable 'port2_profile' from source: play vars 13131 1726867218.87061: variable 'dhcp_interface2' from source: play vars 13131 1726867218.87126: variable 'dhcp_interface2' from source: play vars 13131 1726867218.87132: variable 'controller_profile' from source: play vars 13131 1726867218.87208: variable 'controller_profile' from source: play vars 13131 1726867218.87231: variable '__network_packages_default_team' from source: role '' defaults 13131 1726867218.87314: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867218.87663: variable 'network_connections' from source: task vars 13131 1726867218.87666: variable 'controller_profile' from source: play vars 13131 1726867218.87700: variable 'controller_profile' from source: play vars 13131 1726867218.87706: variable 'controller_device' from source: play vars 13131 1726867218.87772: variable 'controller_device' from source: play vars 13131 1726867218.87786: variable 'port1_profile' from source: play vars 13131 1726867218.87851: variable 'port1_profile' from source: play vars 13131 1726867218.87854: variable 'dhcp_interface1' from source: play vars 13131 1726867218.87953: variable 'dhcp_interface1' from source: play vars 13131 1726867218.87956: variable 'controller_profile' from source: play vars 13131 1726867218.87992: variable 'controller_profile' from source: play vars 13131 1726867218.88079: variable 'port2_profile' from source: play vars 13131 1726867218.88083: variable 'port2_profile' from source: play vars 13131 1726867218.88086: variable 'dhcp_interface2' from source: play vars 13131 1726867218.88139: variable 'dhcp_interface2' from source: play vars 13131 1726867218.88147: variable 'controller_profile' from source: play vars 13131 1726867218.88206: variable 'controller_profile' from source: play vars 
13131 1726867218.88258: variable '__network_service_name_default_initscripts' from source: role '' defaults 13131 1726867218.88321: variable '__network_service_name_default_initscripts' from source: role '' defaults 13131 1726867218.88327: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 1726867218.88380: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 1726867218.88668: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13131 1726867218.89030: variable 'network_connections' from source: task vars 13131 1726867218.89033: variable 'controller_profile' from source: play vars 13131 1726867218.89091: variable 'controller_profile' from source: play vars 13131 1726867218.89097: variable 'controller_device' from source: play vars 13131 1726867218.89153: variable 'controller_device' from source: play vars 13131 1726867218.89161: variable 'port1_profile' from source: play vars 13131 1726867218.89217: variable 'port1_profile' from source: play vars 13131 1726867218.89224: variable 'dhcp_interface1' from source: play vars 13131 1726867218.89282: variable 'dhcp_interface1' from source: play vars 13131 1726867218.89289: variable 'controller_profile' from source: play vars 13131 1726867218.89343: variable 'controller_profile' from source: play vars 13131 1726867218.89351: variable 'port2_profile' from source: play vars 13131 1726867218.89410: variable 'port2_profile' from source: play vars 13131 1726867218.89413: variable 'dhcp_interface2' from source: play vars 13131 1726867218.89468: variable 'dhcp_interface2' from source: play vars 13131 1726867218.89474: variable 'controller_profile' from source: play vars 13131 1726867218.89530: variable 'controller_profile' from source: play vars 13131 1726867218.89538: variable 'ansible_distribution' from source: facts 13131 1726867218.89548: variable '__network_rh_distros' from source: role '' defaults 13131 1726867218.89551: 
variable 'ansible_distribution_major_version' from source: facts 13131 1726867218.89591: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13131 1726867218.89722: variable 'ansible_distribution' from source: facts 13131 1726867218.89725: variable '__network_rh_distros' from source: role '' defaults 13131 1726867218.89730: variable 'ansible_distribution_major_version' from source: facts 13131 1726867218.89741: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13131 1726867218.89856: variable 'ansible_distribution' from source: facts 13131 1726867218.89859: variable '__network_rh_distros' from source: role '' defaults 13131 1726867218.89870: variable 'ansible_distribution_major_version' from source: facts 13131 1726867218.89893: variable 'network_provider' from source: set_fact 13131 1726867218.89910: variable 'omit' from source: magic vars 13131 1726867218.89934: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867218.89954: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867218.89968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867218.89984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867218.89993: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867218.90015: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867218.90018: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867218.90021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867218.90088: Set connection var ansible_connection to ssh 13131 
1726867218.90094: Set connection var ansible_timeout to 10 13131 1726867218.90097: Set connection var ansible_shell_type to sh 13131 1726867218.90106: Set connection var ansible_shell_executable to /bin/sh 13131 1726867218.90113: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867218.90117: Set connection var ansible_pipelining to False 13131 1726867218.90135: variable 'ansible_shell_executable' from source: unknown 13131 1726867218.90138: variable 'ansible_connection' from source: unknown 13131 1726867218.90140: variable 'ansible_module_compression' from source: unknown 13131 1726867218.90142: variable 'ansible_shell_type' from source: unknown 13131 1726867218.90144: variable 'ansible_shell_executable' from source: unknown 13131 1726867218.90147: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867218.90151: variable 'ansible_pipelining' from source: unknown 13131 1726867218.90153: variable 'ansible_timeout' from source: unknown 13131 1726867218.90157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867218.90232: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867218.90240: variable 'omit' from source: magic vars 13131 1726867218.90246: starting attempt loop 13131 1726867218.90248: running the handler 13131 1726867218.90307: variable 'ansible_facts' from source: unknown 13131 1726867218.90698: _low_level_execute_command(): starting 13131 1726867218.90709: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867218.91145: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13131 1726867218.91183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867218.91186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867218.91190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867218.91192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867218.91228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867218.91240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867218.91302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867218.92992: stdout chunk (state=3): >>>/root <<< 13131 1726867218.93093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867218.93117: stderr chunk (state=3): >>><<< 13131 1726867218.93120: stdout chunk (state=3): >>><<< 13131 1726867218.93141: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867218.93151: _low_level_execute_command(): starting 13131 1726867218.93156: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867218.9313962-14792-77551012064665 `" && echo ansible-tmp-1726867218.9313962-14792-77551012064665="` echo /root/.ansible/tmp/ansible-tmp-1726867218.9313962-14792-77551012064665 `" ) && sleep 0' 13131 1726867218.93544: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867218.93582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867218.93585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867218.93588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867218.93590: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867218.93592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867218.93635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867218.93638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867218.93693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867218.95572: stdout chunk (state=3): >>>ansible-tmp-1726867218.9313962-14792-77551012064665=/root/.ansible/tmp/ansible-tmp-1726867218.9313962-14792-77551012064665 <<< 13131 1726867218.95682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867218.95704: stderr chunk (state=3): >>><<< 13131 1726867218.95709: stdout chunk (state=3): >>><<< 13131 1726867218.95723: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867218.9313962-14792-77551012064665=/root/.ansible/tmp/ansible-tmp-1726867218.9313962-14792-77551012064665 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867218.95749: variable 'ansible_module_compression' from source: unknown 13131 1726867218.95788: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13131 1726867218.95843: variable 'ansible_facts' from source: unknown 13131 1726867218.95974: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867218.9313962-14792-77551012064665/AnsiballZ_systemd.py 13131 1726867218.96075: Sending initial data 13131 1726867218.96081: Sent initial data (155 bytes) 13131 1726867218.96464: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867218.96495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867218.96498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867218.96500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867218.96504: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867218.96506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867218.96555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867218.96558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867218.96607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867218.98151: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13131 1726867218.98154: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867218.98194: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867218.98245: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpj0dbh1rr /root/.ansible/tmp/ansible-tmp-1726867218.9313962-14792-77551012064665/AnsiballZ_systemd.py <<< 13131 1726867218.98248: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867218.9313962-14792-77551012064665/AnsiballZ_systemd.py" <<< 13131 1726867218.98292: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpj0dbh1rr" to remote "/root/.ansible/tmp/ansible-tmp-1726867218.9313962-14792-77551012064665/AnsiballZ_systemd.py" <<< 13131 1726867218.98295: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867218.9313962-14792-77551012064665/AnsiballZ_systemd.py" <<< 13131 1726867218.99375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867218.99407: stderr chunk (state=3): >>><<< 13131 1726867218.99412: stdout chunk (state=3): >>><<< 13131 1726867218.99443: done transferring module to remote 13131 1726867218.99451: _low_level_execute_command(): starting 13131 1726867218.99454: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867218.9313962-14792-77551012064665/ /root/.ansible/tmp/ansible-tmp-1726867218.9313962-14792-77551012064665/AnsiballZ_systemd.py && sleep 0' 13131 1726867218.99861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867218.99864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 
1726867218.99866: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 13131 1726867218.99868: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867218.99870: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867218.99905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867218.99917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867218.99969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867219.01744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867219.01768: stderr chunk (state=3): >>><<< 13131 1726867219.01770: stdout chunk (state=3): >>><<< 13131 1726867219.01781: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867219.01791: _low_level_execute_command(): starting 13131 1726867219.01794: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867218.9313962-14792-77551012064665/AnsiballZ_systemd.py && sleep 0' 13131 1726867219.02203: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867219.02207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867219.02209: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 13131 1726867219.02211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867219.02213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867219.02261: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867219.02264: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867219.02320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867219.31371: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": "14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager 
--no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call or<<< 13131 1726867219.31387: stdout chunk (state=3): >>>g.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10731520", "MemoryPeak": "14745600", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3297738752", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "877547000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": 
"infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", 
"LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "Private<<< 13131 1726867219.31423: stdout chunk (state=3): >>>IPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target NetworkManager-wait-online.service network.target", "After": "dbus-broker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": 
"fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", "AssertTimestampMonotonic": "14925366", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13131 1726867219.33249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867219.33284: stderr chunk (state=3): >>><<< 13131 1726867219.33287: stdout chunk (state=3): >>><<< 13131 1726867219.33303: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": "14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10731520", "MemoryPeak": "14745600", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3297738752", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "877547000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target NetworkManager-wait-online.service network.target", "After": "dbus-broker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", "AssertTimestampMonotonic": "14925366", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
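The `invocation.module_args` recorded above (`name: NetworkManager`, `state: started`, `enabled: true`) together with `'_ansible_no_log': True` in the module parameters let us reconstruct the approximate shape of the role task that produced this result. This is a hypothetical sketch for orientation only; the real task lives in the `fedora.linux_system_roles.network` role's `tasks/main.yml` and may differ in wording and options:

```yaml
# Hypothetical reconstruction of the task whose result is logged above.
# Grounded in the logged module_args; the actual role task may differ.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true   # matches '_ansible_no_log': True; explains the "censored" result body
```

The `no_log: true` setting is why the task result printed later shows only the `"censored"` placeholder instead of the full systemd status dictionary that appears in the raw stdout chunks.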
13131 1726867219.33422: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867218.9313962-14792-77551012064665/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867219.33438: _low_level_execute_command(): starting 13131 1726867219.33443: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867218.9313962-14792-77551012064665/ > /dev/null 2>&1 && sleep 0' 13131 1726867219.33872: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867219.33876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867219.33914: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867219.33917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867219.33919: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 13131 1726867219.33921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867219.33923: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867219.33980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867219.33983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867219.33988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867219.34032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867219.35835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867219.35860: stderr chunk (state=3): >>><<< 13131 1726867219.35863: stdout chunk (state=3): >>><<< 13131 1726867219.35881: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867219.35887: handler run complete 13131 1726867219.35924: attempt loop complete, returning result 13131 1726867219.35927: _execute() done 13131 1726867219.35929: dumping result to json 13131 1726867219.35941: done dumping result, returning 13131 1726867219.35949: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-5f24-9b7a-0000000000db] 13131 1726867219.35952: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000db 13131 1726867219.36145: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000db 13131 1726867219.36148: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867219.36206: no more pending results, returning what we have 13131 1726867219.36209: results queue empty 13131 1726867219.36210: checking for any_errors_fatal 13131 1726867219.36217: done checking for any_errors_fatal 13131 1726867219.36218: checking for max_fail_percentage 13131 1726867219.36220: done checking for max_fail_percentage 13131 1726867219.36220: checking to see if all hosts have failed and the running result is not ok 13131 1726867219.36221: done checking to see if all hosts have failed 13131 1726867219.36222: getting the remaining hosts for this loop 13131 1726867219.36223: done getting the remaining hosts for this loop 13131 1726867219.36226: getting the next task for host managed_node1 13131 1726867219.36232: done getting next task for host managed_node1 13131 1726867219.36236: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13131 
1726867219.36239: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867219.36251: getting variables 13131 1726867219.36252: in VariableManager get_vars() 13131 1726867219.36336: Calling all_inventory to load vars for managed_node1 13131 1726867219.36339: Calling groups_inventory to load vars for managed_node1 13131 1726867219.36342: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867219.36350: Calling all_plugins_play to load vars for managed_node1 13131 1726867219.36353: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867219.36355: Calling groups_plugins_play to load vars for managed_node1 13131 1726867219.37263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867219.38130: done with get_vars() 13131 1726867219.38146: done getting variables 13131 1726867219.38190: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:20:19 -0400 (0:00:00.588) 0:00:34.492 
****** 13131 1726867219.38215: entering _queue_task() for managed_node1/service 13131 1726867219.38456: worker is 1 (out of 1 available) 13131 1726867219.38471: exiting _queue_task() for managed_node1/service 13131 1726867219.38484: done queuing things up, now waiting for results queue to drain 13131 1726867219.38485: waiting for pending results... 13131 1726867219.38650: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13131 1726867219.38746: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000dc 13131 1726867219.38759: variable 'ansible_search_path' from source: unknown 13131 1726867219.38762: variable 'ansible_search_path' from source: unknown 13131 1726867219.38792: calling self._execute() 13131 1726867219.38869: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867219.38872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867219.38884: variable 'omit' from source: magic vars 13131 1726867219.39149: variable 'ansible_distribution_major_version' from source: facts 13131 1726867219.39163: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867219.39242: variable 'network_provider' from source: set_fact 13131 1726867219.39247: Evaluated conditional (network_provider == "nm"): True 13131 1726867219.39317: variable '__network_wpa_supplicant_required' from source: role '' defaults 13131 1726867219.39379: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13131 1726867219.39500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867219.40925: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867219.40971: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867219.41001: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867219.41030: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867219.41050: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867219.41121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867219.41142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867219.41159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867219.41186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867219.41197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867219.41234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867219.41251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867219.41267: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867219.41293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867219.41303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867219.41337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867219.41352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867219.41368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867219.41393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867219.41403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867219.41499: variable 'network_connections' from source: task vars 13131 1726867219.41511: variable 'controller_profile' from source: play vars 13131 1726867219.41557: variable 'controller_profile' from source: play vars 13131 
1726867219.41568: variable 'controller_device' from source: play vars 13131 1726867219.41614: variable 'controller_device' from source: play vars 13131 1726867219.41621: variable 'port1_profile' from source: play vars 13131 1726867219.41667: variable 'port1_profile' from source: play vars 13131 1726867219.41672: variable 'dhcp_interface1' from source: play vars 13131 1726867219.41718: variable 'dhcp_interface1' from source: play vars 13131 1726867219.41724: variable 'controller_profile' from source: play vars 13131 1726867219.41768: variable 'controller_profile' from source: play vars 13131 1726867219.41774: variable 'port2_profile' from source: play vars 13131 1726867219.41818: variable 'port2_profile' from source: play vars 13131 1726867219.41824: variable 'dhcp_interface2' from source: play vars 13131 1726867219.41868: variable 'dhcp_interface2' from source: play vars 13131 1726867219.41874: variable 'controller_profile' from source: play vars 13131 1726867219.41919: variable 'controller_profile' from source: play vars 13131 1726867219.41974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867219.42084: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867219.42110: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867219.42132: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867219.42153: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867219.42186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867219.42202: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867219.42221: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867219.42242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867219.42281: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867219.42443: variable 'network_connections' from source: task vars 13131 1726867219.42451: variable 'controller_profile' from source: play vars 13131 1726867219.42495: variable 'controller_profile' from source: play vars 13131 1726867219.42501: variable 'controller_device' from source: play vars 13131 1726867219.42547: variable 'controller_device' from source: play vars 13131 1726867219.42554: variable 'port1_profile' from source: play vars 13131 1726867219.42596: variable 'port1_profile' from source: play vars 13131 1726867219.42602: variable 'dhcp_interface1' from source: play vars 13131 1726867219.42647: variable 'dhcp_interface1' from source: play vars 13131 1726867219.42652: variable 'controller_profile' from source: play vars 13131 1726867219.42694: variable 'controller_profile' from source: play vars 13131 1726867219.42700: variable 'port2_profile' from source: play vars 13131 1726867219.42745: variable 'port2_profile' from source: play vars 13131 1726867219.42751: variable 'dhcp_interface2' from source: play vars 13131 1726867219.42793: variable 'dhcp_interface2' from source: play vars 13131 1726867219.42799: variable 'controller_profile' from source: play vars 13131 1726867219.42845: variable 'controller_profile' from source: play vars 13131 1726867219.42873: Evaluated conditional 
(__network_wpa_supplicant_required): False 13131 1726867219.42876: when evaluation is False, skipping this task 13131 1726867219.42880: _execute() done 13131 1726867219.42883: dumping result to json 13131 1726867219.42885: done dumping result, returning 13131 1726867219.42892: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-5f24-9b7a-0000000000dc] 13131 1726867219.42895: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000dc 13131 1726867219.42980: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000dc 13131 1726867219.42982: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13131 1726867219.43025: no more pending results, returning what we have 13131 1726867219.43029: results queue empty 13131 1726867219.43030: checking for any_errors_fatal 13131 1726867219.43043: done checking for any_errors_fatal 13131 1726867219.43044: checking for max_fail_percentage 13131 1726867219.43046: done checking for max_fail_percentage 13131 1726867219.43046: checking to see if all hosts have failed and the running result is not ok 13131 1726867219.43047: done checking to see if all hosts have failed 13131 1726867219.43048: getting the remaining hosts for this loop 13131 1726867219.43049: done getting the remaining hosts for this loop 13131 1726867219.43052: getting the next task for host managed_node1 13131 1726867219.43059: done getting next task for host managed_node1 13131 1726867219.43063: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13131 1726867219.43065: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867219.43086: getting variables 13131 1726867219.43087: in VariableManager get_vars() 13131 1726867219.43136: Calling all_inventory to load vars for managed_node1 13131 1726867219.43139: Calling groups_inventory to load vars for managed_node1 13131 1726867219.43141: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867219.43150: Calling all_plugins_play to load vars for managed_node1 13131 1726867219.43152: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867219.43155: Calling groups_plugins_play to load vars for managed_node1 13131 1726867219.43933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867219.44793: done with get_vars() 13131 1726867219.44810: done getting variables 13131 1726867219.44850: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:20:19 -0400 (0:00:00.066) 0:00:34.559 ****** 13131 1726867219.44870: entering _queue_task() for managed_node1/service 13131 1726867219.45085: worker is 1 (out of 1 available) 13131 1726867219.45100: exiting _queue_task() for managed_node1/service 
13131 1726867219.45111: done queuing things up, now waiting for results queue to drain 13131 1726867219.45112: waiting for pending results... 13131 1726867219.45282: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 13131 1726867219.45367: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000dd 13131 1726867219.45380: variable 'ansible_search_path' from source: unknown 13131 1726867219.45383: variable 'ansible_search_path' from source: unknown 13131 1726867219.45413: calling self._execute() 13131 1726867219.45683: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867219.45687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867219.45690: variable 'omit' from source: magic vars 13131 1726867219.45873: variable 'ansible_distribution_major_version' from source: facts 13131 1726867219.45893: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867219.46020: variable 'network_provider' from source: set_fact 13131 1726867219.46035: Evaluated conditional (network_provider == "initscripts"): False 13131 1726867219.46044: when evaluation is False, skipping this task 13131 1726867219.46053: _execute() done 13131 1726867219.46062: dumping result to json 13131 1726867219.46071: done dumping result, returning 13131 1726867219.46087: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-5f24-9b7a-0000000000dd] 13131 1726867219.46097: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000dd skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867219.46285: no more pending results, returning what we have 13131 1726867219.46290: results queue empty 13131 1726867219.46291: checking for any_errors_fatal 13131 1726867219.46299: done checking for 
any_errors_fatal 13131 1726867219.46300: checking for max_fail_percentage 13131 1726867219.46304: done checking for max_fail_percentage 13131 1726867219.46305: checking to see if all hosts have failed and the running result is not ok 13131 1726867219.46306: done checking to see if all hosts have failed 13131 1726867219.46307: getting the remaining hosts for this loop 13131 1726867219.46309: done getting the remaining hosts for this loop 13131 1726867219.46312: getting the next task for host managed_node1 13131 1726867219.46320: done getting next task for host managed_node1 13131 1726867219.46323: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13131 1726867219.46327: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867219.46351: getting variables 13131 1726867219.46353: in VariableManager get_vars() 13131 1726867219.46409: Calling all_inventory to load vars for managed_node1 13131 1726867219.46412: Calling groups_inventory to load vars for managed_node1 13131 1726867219.46415: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867219.46427: Calling all_plugins_play to load vars for managed_node1 13131 1726867219.46430: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867219.46434: Calling groups_plugins_play to load vars for managed_node1 13131 1726867219.47092: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000dd 13131 1726867219.47095: WORKER PROCESS EXITING 13131 1726867219.48051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867219.49622: done with get_vars() 13131 1726867219.49644: done getting variables 13131 1726867219.49709: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:20:19 -0400 (0:00:00.048) 0:00:34.607 ****** 13131 1726867219.49741: entering _queue_task() for managed_node1/copy 13131 1726867219.50008: worker is 1 (out of 1 available) 13131 1726867219.50020: exiting _queue_task() for managed_node1/copy 13131 1726867219.50031: done queuing things up, now waiting for results queue to drain 13131 1726867219.50033: waiting for pending results... 
13131 1726867219.50301: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13131 1726867219.50393: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000de 13131 1726867219.50406: variable 'ansible_search_path' from source: unknown 13131 1726867219.50409: variable 'ansible_search_path' from source: unknown 13131 1726867219.50438: calling self._execute() 13131 1726867219.50514: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867219.50518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867219.50527: variable 'omit' from source: magic vars 13131 1726867219.50795: variable 'ansible_distribution_major_version' from source: facts 13131 1726867219.50804: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867219.50882: variable 'network_provider' from source: set_fact 13131 1726867219.50886: Evaluated conditional (network_provider == "initscripts"): False 13131 1726867219.50889: when evaluation is False, skipping this task 13131 1726867219.50891: _execute() done 13131 1726867219.50894: dumping result to json 13131 1726867219.50898: done dumping result, returning 13131 1726867219.50914: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-5f24-9b7a-0000000000de] 13131 1726867219.50917: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000de 13131 1726867219.50998: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000de 13131 1726867219.51001: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13131 1726867219.51054: no more pending results, returning what we have 13131 1726867219.51058: results queue empty 13131 1726867219.51058: checking for 
any_errors_fatal 13131 1726867219.51063: done checking for any_errors_fatal 13131 1726867219.51064: checking for max_fail_percentage 13131 1726867219.51065: done checking for max_fail_percentage 13131 1726867219.51066: checking to see if all hosts have failed and the running result is not ok 13131 1726867219.51067: done checking to see if all hosts have failed 13131 1726867219.51067: getting the remaining hosts for this loop 13131 1726867219.51069: done getting the remaining hosts for this loop 13131 1726867219.51071: getting the next task for host managed_node1 13131 1726867219.51079: done getting next task for host managed_node1 13131 1726867219.51082: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13131 1726867219.51085: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867219.51102: getting variables 13131 1726867219.51103: in VariableManager get_vars() 13131 1726867219.51144: Calling all_inventory to load vars for managed_node1 13131 1726867219.51146: Calling groups_inventory to load vars for managed_node1 13131 1726867219.51148: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867219.51156: Calling all_plugins_play to load vars for managed_node1 13131 1726867219.51158: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867219.51160: Calling groups_plugins_play to load vars for managed_node1 13131 1726867219.51870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867219.52811: done with get_vars() 13131 1726867219.52825: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:20:19 -0400 (0:00:00.031) 0:00:34.639 ****** 13131 1726867219.52881: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 13131 1726867219.53063: worker is 1 (out of 1 available) 13131 1726867219.53078: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 13131 1726867219.53089: done queuing things up, now waiting for results queue to drain 13131 1726867219.53090: waiting for pending results... 
13131 1726867219.53247: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13131 1726867219.53335: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000df 13131 1726867219.53347: variable 'ansible_search_path' from source: unknown 13131 1726867219.53351: variable 'ansible_search_path' from source: unknown 13131 1726867219.53378: calling self._execute() 13131 1726867219.53447: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867219.53452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867219.53461: variable 'omit' from source: magic vars 13131 1726867219.53720: variable 'ansible_distribution_major_version' from source: facts 13131 1726867219.53729: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867219.53735: variable 'omit' from source: magic vars 13131 1726867219.53774: variable 'omit' from source: magic vars 13131 1726867219.53884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867219.59697: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867219.59769: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867219.59983: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867219.59986: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867219.59989: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867219.59992: variable 'network_provider' from source: set_fact 13131 1726867219.60071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867219.60108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867219.60139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867219.60186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867219.60208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867219.60282: variable 'omit' from source: magic vars 13131 1726867219.60394: variable 'omit' from source: magic vars 13131 1726867219.60496: variable 'network_connections' from source: task vars 13131 1726867219.60515: variable 'controller_profile' from source: play vars 13131 1726867219.60575: variable 'controller_profile' from source: play vars 13131 1726867219.60590: variable 'controller_device' from source: play vars 13131 1726867219.60653: variable 'controller_device' from source: play vars 13131 1726867219.60661: variable 'port1_profile' from source: play vars 13131 1726867219.60726: variable 'port1_profile' from source: play vars 13131 1726867219.60738: variable 'dhcp_interface1' from source: play vars 13131 1726867219.60799: variable 'dhcp_interface1' from source: play vars 13131 1726867219.60814: variable 'controller_profile' from source: play vars 13131 1726867219.60875: variable 'controller_profile' from source: play vars 13131 1726867219.60890: 
variable 'port2_profile' from source: play vars 13131 1726867219.60951: variable 'port2_profile' from source: play vars 13131 1726867219.60962: variable 'dhcp_interface2' from source: play vars 13131 1726867219.61023: variable 'dhcp_interface2' from source: play vars 13131 1726867219.61033: variable 'controller_profile' from source: play vars 13131 1726867219.61084: variable 'controller_profile' from source: play vars 13131 1726867219.61242: variable 'omit' from source: magic vars 13131 1726867219.61255: variable '__lsr_ansible_managed' from source: task vars 13131 1726867219.61315: variable '__lsr_ansible_managed' from source: task vars 13131 1726867219.61479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13131 1726867219.61678: Loaded config def from plugin (lookup/template) 13131 1726867219.61687: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13131 1726867219.61714: File lookup term: get_ansible_managed.j2 13131 1726867219.61782: variable 'ansible_search_path' from source: unknown 13131 1726867219.61787: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13131 1726867219.61792: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13131 1726867219.61795: variable 'ansible_search_path' from source: unknown 13131 1726867219.67505: variable 'ansible_managed' from source: unknown 13131 1726867219.67628: variable 'omit' from source: magic vars 13131 1726867219.67659: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867219.67686: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867219.67706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867219.67725: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867219.67737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867219.67760: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867219.67982: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867219.67985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867219.67987: Set connection var ansible_connection to ssh 13131 1726867219.67989: Set connection var ansible_timeout to 10 13131 1726867219.67991: Set connection var ansible_shell_type to sh 13131 1726867219.67993: Set connection var ansible_shell_executable to /bin/sh 13131 1726867219.67995: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867219.68005: Set connection var ansible_pipelining to False 13131 1726867219.68007: 
variable 'ansible_shell_executable' from source: unknown 13131 1726867219.68009: variable 'ansible_connection' from source: unknown 13131 1726867219.68011: variable 'ansible_module_compression' from source: unknown 13131 1726867219.68013: variable 'ansible_shell_type' from source: unknown 13131 1726867219.68015: variable 'ansible_shell_executable' from source: unknown 13131 1726867219.68016: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867219.68019: variable 'ansible_pipelining' from source: unknown 13131 1726867219.68020: variable 'ansible_timeout' from source: unknown 13131 1726867219.68022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867219.68099: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867219.68115: variable 'omit' from source: magic vars 13131 1726867219.68126: starting attempt loop 13131 1726867219.68132: running the handler 13131 1726867219.68145: _low_level_execute_command(): starting 13131 1726867219.68153: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867219.68815: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867219.68830: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867219.68890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867219.68948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867219.68964: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867219.68994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867219.69084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867219.70768: stdout chunk (state=3): >>>/root <<< 13131 1726867219.70897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867219.70924: stderr chunk (state=3): >>><<< 13131 1726867219.70942: stdout chunk (state=3): >>><<< 13131 1726867219.70965: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867219.70985: _low_level_execute_command(): starting 13131 1726867219.70996: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867219.709717-14815-134170599672173 `" && echo ansible-tmp-1726867219.709717-14815-134170599672173="` echo /root/.ansible/tmp/ansible-tmp-1726867219.709717-14815-134170599672173 `" ) && sleep 0' 13131 1726867219.71650: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867219.71667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867219.71685: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 13131 1726867219.71763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867219.73630: stdout chunk (state=3): >>>ansible-tmp-1726867219.709717-14815-134170599672173=/root/.ansible/tmp/ansible-tmp-1726867219.709717-14815-134170599672173 <<< 13131 1726867219.73769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867219.73786: stdout chunk (state=3): >>><<< 13131 1726867219.73797: stderr chunk (state=3): >>><<< 13131 1726867219.73820: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867219.709717-14815-134170599672173=/root/.ansible/tmp/ansible-tmp-1726867219.709717-14815-134170599672173 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867219.73862: variable 'ansible_module_compression' from source: unknown 
13131 1726867219.73915: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13131 1726867219.73952: variable 'ansible_facts' from source: unknown 13131 1726867219.74152: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867219.709717-14815-134170599672173/AnsiballZ_network_connections.py 13131 1726867219.74276: Sending initial data 13131 1726867219.74290: Sent initial data (167 bytes) 13131 1726867219.74832: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867219.74846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867219.74861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867219.74880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867219.74898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867219.74991: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867219.75011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867219.75031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
13131 1726867219.75051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867219.75127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867219.76657: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867219.76728: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867219.76780: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpvs314q13 /root/.ansible/tmp/ansible-tmp-1726867219.709717-14815-134170599672173/AnsiballZ_network_connections.py <<< 13131 1726867219.76829: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867219.709717-14815-134170599672173/AnsiballZ_network_connections.py" <<< 13131 1726867219.76833: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 13131 1726867219.76855: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpvs314q13" to remote "/root/.ansible/tmp/ansible-tmp-1726867219.709717-14815-134170599672173/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867219.709717-14815-134170599672173/AnsiballZ_network_connections.py" <<< 13131 1726867219.77935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867219.77972: stderr chunk (state=3): >>><<< 13131 1726867219.77985: stdout chunk (state=3): >>><<< 13131 1726867219.78072: done transferring module to remote 13131 1726867219.78075: _low_level_execute_command(): starting 13131 1726867219.78080: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867219.709717-14815-134170599672173/ /root/.ansible/tmp/ansible-tmp-1726867219.709717-14815-134170599672173/AnsiballZ_network_connections.py && sleep 0' 13131 1726867219.78669: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867219.78683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867219.78696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867219.78714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867219.78731: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867219.78741: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867219.78752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867219.78791: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867219.78855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867219.78869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867219.78891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867219.78965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867219.80768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867219.80780: stdout chunk (state=3): >>><<< 13131 1726867219.80797: stderr chunk (state=3): >>><<< 13131 1726867219.80822: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867219.80838: _low_level_execute_command(): starting 13131 1726867219.80883: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867219.709717-14815-134170599672173/AnsiballZ_network_connections.py && sleep 0' 13131 1726867219.81461: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867219.81475: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867219.81498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867219.81527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867219.81625: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867219.81650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867219.81739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867220.32084: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 69e7ee46-007a-470e-9bdc-4928b4af57bb\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 69e7ee46-007a-470e-9bdc-4928b4af57bb (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": 
false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13131 1726867220.34049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867220.34076: stderr chunk (state=3): >>><<< 13131 1726867220.34082: stdout chunk (state=3): >>><<< 13131 1726867220.34106: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 69e7ee46-007a-470e-9bdc-4928b4af57bb\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 69e7ee46-007a-470e-9bdc-4928b4af57bb (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": 
"ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867220.34140: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867219.709717-14815-134170599672173/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867220.34148: _low_level_execute_command(): starting 13131 1726867220.34151: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867219.709717-14815-134170599672173/ > /dev/null 2>&1 && sleep 0' 13131 1726867220.34603: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867220.34606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867220.34609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13131 1726867220.34611: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867220.34613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867220.34658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867220.34666: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867220.34726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867220.36574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867220.36583: stdout chunk (state=3): >>><<< 13131 1726867220.36589: stderr chunk (state=3): >>><<< 13131 1726867220.36600: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867220.36605: handler run complete 13131 1726867220.36629: attempt loop complete, returning result 13131 1726867220.36632: _execute() done 13131 1726867220.36634: dumping result to json 13131 1726867220.36641: done dumping result, returning 13131 1726867220.36648: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-5f24-9b7a-0000000000df] 13131 1726867220.36651: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000df 13131 1726867220.36763: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000df 13131 1726867220.36766: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, 
state:up persistent_state:present, 'bond0': add connection bond0, 69e7ee46-007a-470e-9bdc-4928b4af57bb [008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a [009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 69e7ee46-007a-470e-9bdc-4928b4af57bb (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723 (not-active) 13131 1726867220.36894: no more pending results, returning what we have 13131 1726867220.36898: results queue empty 13131 1726867220.36899: checking for any_errors_fatal 13131 1726867220.36906: done checking for any_errors_fatal 13131 1726867220.36907: checking for max_fail_percentage 13131 1726867220.36908: done checking for max_fail_percentage 13131 1726867220.36909: checking to see if all hosts have failed and the running result is not ok 13131 1726867220.36910: done checking to see if all hosts have failed 13131 1726867220.36910: getting the remaining hosts for this loop 13131 1726867220.36912: done getting the remaining hosts for this loop 13131 1726867220.36915: getting the next task for host managed_node1 13131 1726867220.36920: done getting next task for host managed_node1 13131 1726867220.36926: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13131 1726867220.36929: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867220.36940: getting variables 13131 1726867220.36941: in VariableManager get_vars() 13131 1726867220.36994: Calling all_inventory to load vars for managed_node1 13131 1726867220.36997: Calling groups_inventory to load vars for managed_node1 13131 1726867220.36999: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867220.37011: Calling all_plugins_play to load vars for managed_node1 13131 1726867220.37013: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867220.37015: Calling groups_plugins_play to load vars for managed_node1 13131 1726867220.41475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867220.42322: done with get_vars() 13131 1726867220.42336: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:20:20 -0400 (0:00:00.895) 0:00:35.534 ****** 13131 1726867220.42385: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 13131 1726867220.42654: worker is 1 (out of 1 available) 13131 1726867220.42668: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 13131 1726867220.42681: done queuing things up, now waiting for results queue to drain 13131 1726867220.42683: waiting for pending results... 
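The module_args echoed in the "Configure networking connection profiles" result above correspond to a `network_connections` variable supplied to the fedora.linux_system_roles.network role on the controller. A minimal sketch of role vars that would produce that exact invocation — reconstructed from the module_args in the log, not copied from the actual playbook (which this log does not include):

```yaml
# Sketch only: rebuilt from the module_args shown in the log above.
# The real playbook driving this run is not part of the log.
network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: nm-bond
    bond:
      mode: active-backup
      miimon: 110
    ip:
      route_metric4: 65535
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    controller: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    controller: bond0
```

With `provider: nm`, the role renders these entries into the NetworkManager profiles whose add/update/up actions appear in the STDERR lines above (bond0 added, bond0.0 and bond0.1 updated and brought up as ports of the bond).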
13131 1726867220.42852: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 13131 1726867220.42947: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000e0 13131 1726867220.42959: variable 'ansible_search_path' from source: unknown 13131 1726867220.42962: variable 'ansible_search_path' from source: unknown 13131 1726867220.42996: calling self._execute() 13131 1726867220.43078: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867220.43083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867220.43092: variable 'omit' from source: magic vars 13131 1726867220.43378: variable 'ansible_distribution_major_version' from source: facts 13131 1726867220.43388: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867220.43471: variable 'network_state' from source: role '' defaults 13131 1726867220.43480: Evaluated conditional (network_state != {}): False 13131 1726867220.43483: when evaluation is False, skipping this task 13131 1726867220.43485: _execute() done 13131 1726867220.43488: dumping result to json 13131 1726867220.43491: done dumping result, returning 13131 1726867220.43498: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-5f24-9b7a-0000000000e0] 13131 1726867220.43502: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000e0 13131 1726867220.43596: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000e0 13131 1726867220.43599: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867220.43649: no more pending results, returning what we have 13131 1726867220.43652: results queue empty 13131 1726867220.43653: checking for any_errors_fatal 13131 1726867220.43669: done checking for any_errors_fatal 
13131 1726867220.43670: checking for max_fail_percentage 13131 1726867220.43671: done checking for max_fail_percentage 13131 1726867220.43673: checking to see if all hosts have failed and the running result is not ok 13131 1726867220.43673: done checking to see if all hosts have failed 13131 1726867220.43674: getting the remaining hosts for this loop 13131 1726867220.43676: done getting the remaining hosts for this loop 13131 1726867220.43684: getting the next task for host managed_node1 13131 1726867220.43690: done getting next task for host managed_node1 13131 1726867220.43693: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13131 1726867220.43696: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867220.43716: getting variables 13131 1726867220.43717: in VariableManager get_vars() 13131 1726867220.43757: Calling all_inventory to load vars for managed_node1 13131 1726867220.43759: Calling groups_inventory to load vars for managed_node1 13131 1726867220.43761: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867220.43769: Calling all_plugins_play to load vars for managed_node1 13131 1726867220.43771: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867220.43774: Calling groups_plugins_play to load vars for managed_node1 13131 1726867220.44566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867220.45423: done with get_vars() 13131 1726867220.45437: done getting variables 13131 1726867220.45475: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:20:20 -0400 (0:00:00.031) 0:00:35.565 ****** 13131 1726867220.45499: entering _queue_task() for managed_node1/debug 13131 1726867220.45700: worker is 1 (out of 1 available) 13131 1726867220.45713: exiting _queue_task() for managed_node1/debug 13131 1726867220.45725: done queuing things up, now waiting for results queue to drain 13131 1726867220.45726: waiting for pending results... 
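The debug handler being queued here runs the role task at tasks/main.yml:177, "Show stderr messages for the network_connections". Judging from the task name and the variable it prints (`__network_connections_result.stderr_lines`, visible in the ok: result that follows), the task is likely along these lines — a sketch inferred from the log, not the role's verbatim source:

```yaml
# Inferred from the task name and the printed variable; not copied
# from the fedora.linux_system_roles.network role source.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines
```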
13131 1726867220.45885: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13131 1726867220.45979: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000e1 13131 1726867220.45990: variable 'ansible_search_path' from source: unknown 13131 1726867220.45994: variable 'ansible_search_path' from source: unknown 13131 1726867220.46024: calling self._execute() 13131 1726867220.46096: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867220.46100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867220.46112: variable 'omit' from source: magic vars 13131 1726867220.46368: variable 'ansible_distribution_major_version' from source: facts 13131 1726867220.46376: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867220.46384: variable 'omit' from source: magic vars 13131 1726867220.46427: variable 'omit' from source: magic vars 13131 1726867220.46453: variable 'omit' from source: magic vars 13131 1726867220.46483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867220.46514: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867220.46530: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867220.46543: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867220.46552: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867220.46575: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867220.46579: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867220.46584: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 13131 1726867220.46651: Set connection var ansible_connection to ssh 13131 1726867220.46657: Set connection var ansible_timeout to 10 13131 1726867220.46660: Set connection var ansible_shell_type to sh 13131 1726867220.46667: Set connection var ansible_shell_executable to /bin/sh 13131 1726867220.46674: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867220.46680: Set connection var ansible_pipelining to False 13131 1726867220.46697: variable 'ansible_shell_executable' from source: unknown 13131 1726867220.46700: variable 'ansible_connection' from source: unknown 13131 1726867220.46707: variable 'ansible_module_compression' from source: unknown 13131 1726867220.46709: variable 'ansible_shell_type' from source: unknown 13131 1726867220.46713: variable 'ansible_shell_executable' from source: unknown 13131 1726867220.46716: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867220.46718: variable 'ansible_pipelining' from source: unknown 13131 1726867220.46720: variable 'ansible_timeout' from source: unknown 13131 1726867220.46722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867220.46815: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867220.46823: variable 'omit' from source: magic vars 13131 1726867220.46828: starting attempt loop 13131 1726867220.46839: running the handler 13131 1726867220.46922: variable '__network_connections_result' from source: set_fact 13131 1726867220.46973: handler run complete 13131 1726867220.46987: attempt loop complete, returning result 13131 1726867220.46990: _execute() done 13131 1726867220.46992: dumping result to json 13131 1726867220.46995: 
done dumping result, returning 13131 1726867220.47006: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-5f24-9b7a-0000000000e1] 13131 1726867220.47008: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000e1 13131 1726867220.47087: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000e1 13131 1726867220.47090: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 69e7ee46-007a-470e-9bdc-4928b4af57bb", "[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a", "[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 69e7ee46-007a-470e-9bdc-4928b4af57bb (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723 (not-active)" ] } 13131 1726867220.47157: no more pending results, returning what we have 13131 1726867220.47160: results queue empty 13131 1726867220.47161: checking for any_errors_fatal 13131 1726867220.47165: done checking for any_errors_fatal 13131 1726867220.47166: checking for max_fail_percentage 13131 1726867220.47167: done checking for max_fail_percentage 13131 1726867220.47168: checking to see if all hosts have failed and the running result is not ok 13131 1726867220.47169: done checking to see if all hosts have failed 13131 1726867220.47169: getting the remaining hosts for this loop 13131 1726867220.47171: done getting the remaining hosts for this loop 13131 1726867220.47173: getting the next task for 
host managed_node1 13131 1726867220.47181: done getting next task for host managed_node1 13131 1726867220.47184: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13131 1726867220.47186: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867220.47197: getting variables 13131 1726867220.47200: in VariableManager get_vars() 13131 1726867220.47240: Calling all_inventory to load vars for managed_node1 13131 1726867220.47242: Calling groups_inventory to load vars for managed_node1 13131 1726867220.47244: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867220.47252: Calling all_plugins_play to load vars for managed_node1 13131 1726867220.47255: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867220.47257: Calling groups_plugins_play to load vars for managed_node1 13131 1726867220.47972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867220.48847: done with get_vars() 13131 1726867220.48861: done getting variables 13131 1726867220.48903: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : 
Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:20:20 -0400 (0:00:00.034) 0:00:35.599 ****** 13131 1726867220.48931: entering _queue_task() for managed_node1/debug 13131 1726867220.49118: worker is 1 (out of 1 available) 13131 1726867220.49132: exiting _queue_task() for managed_node1/debug 13131 1726867220.49142: done queuing things up, now waiting for results queue to drain 13131 1726867220.49143: waiting for pending results... 13131 1726867220.49300: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13131 1726867220.49384: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000e2 13131 1726867220.49395: variable 'ansible_search_path' from source: unknown 13131 1726867220.49399: variable 'ansible_search_path' from source: unknown 13131 1726867220.49426: calling self._execute() 13131 1726867220.49496: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867220.49500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867220.49509: variable 'omit' from source: magic vars 13131 1726867220.49765: variable 'ansible_distribution_major_version' from source: facts 13131 1726867220.49774: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867220.49781: variable 'omit' from source: magic vars 13131 1726867220.49823: variable 'omit' from source: magic vars 13131 1726867220.49848: variable 'omit' from source: magic vars 13131 1726867220.49879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867220.49906: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867220.49921: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867220.49934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867220.49944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867220.49965: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867220.49968: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867220.49971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867220.50039: Set connection var ansible_connection to ssh 13131 1726867220.50046: Set connection var ansible_timeout to 10 13131 1726867220.50049: Set connection var ansible_shell_type to sh 13131 1726867220.50056: Set connection var ansible_shell_executable to /bin/sh 13131 1726867220.50064: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867220.50068: Set connection var ansible_pipelining to False 13131 1726867220.50085: variable 'ansible_shell_executable' from source: unknown 13131 1726867220.50088: variable 'ansible_connection' from source: unknown 13131 1726867220.50091: variable 'ansible_module_compression' from source: unknown 13131 1726867220.50094: variable 'ansible_shell_type' from source: unknown 13131 1726867220.50096: variable 'ansible_shell_executable' from source: unknown 13131 1726867220.50099: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867220.50104: variable 'ansible_pipelining' from source: unknown 13131 1726867220.50106: variable 'ansible_timeout' from source: unknown 13131 1726867220.50108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867220.50204: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867220.50209: variable 'omit' from source: magic vars 13131 1726867220.50216: starting attempt loop 13131 1726867220.50219: running the handler 13131 1726867220.50257: variable '__network_connections_result' from source: set_fact 13131 1726867220.50482: variable '__network_connections_result' from source: set_fact 13131 1726867220.50490: handler run complete 13131 1726867220.50524: attempt loop complete, returning result 13131 1726867220.50532: _execute() done 13131 1726867220.50540: dumping result to json 13131 1726867220.50550: done dumping result, returning 13131 1726867220.50562: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-5f24-9b7a-0000000000e2] 13131 1726867220.50570: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000e2 13131 1726867220.50687: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000e2 13131 1726867220.50695: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "bond": {
                            "miimon": 110,
                            "mode": "active-backup"
                        },
                        "interface_name": "nm-bond",
                        "ip": {
                            "route_metric4": 65535
                        },
                        "name": "bond0",
                        "state": "up",
                        "type": "bond"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test1",
                        "name": "bond0.0",
                        "state": "up",
                        "type": "ethernet"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test2",
                        "name": "bond0.1",
                        "state": "up",
                        "type": "ethernet"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 69e7ee46-007a-470e-9bdc-4928b4af57bb\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 69e7ee46-007a-470e-9bdc-4928b4af57bb (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723 (not-active)\n",
        "stderr_lines": [
            "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 69e7ee46-007a-470e-9bdc-4928b4af57bb",
            "[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a",
            "[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723",
            "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 69e7ee46-007a-470e-9bdc-4928b4af57bb (is-modified)",
            "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a (not-active)",
            "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ca2e10a6-bdb2-4703-8f7f-0fc9be649723 (not-active)"
        ]
    }
}
13131 1726867220.50843: no more pending results, returning what we have 13131 1726867220.50846: results queue empty 13131 1726867220.50852: checking for any_errors_fatal 13131 1726867220.50858: done checking for any_errors_fatal 13131 1726867220.50859: checking for max_fail_percentage 13131 1726867220.50860: done checking for max_fail_percentage 13131 1726867220.50861: checking to see if all hosts have failed and the running result is not ok 13131 1726867220.50862: done checking to see if all hosts have failed 13131 1726867220.50863: getting the
remaining hosts for this loop 13131 1726867220.50864: done getting the remaining hosts for this loop 13131 1726867220.50866: getting the next task for host managed_node1 13131 1726867220.50871: done getting next task for host managed_node1 13131 1726867220.50874: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13131 1726867220.50876: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867220.50888: getting variables 13131 1726867220.50889: in VariableManager get_vars() 13131 1726867220.50936: Calling all_inventory to load vars for managed_node1 13131 1726867220.50940: Calling groups_inventory to load vars for managed_node1 13131 1726867220.50942: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867220.50949: Calling all_plugins_play to load vars for managed_node1 13131 1726867220.50952: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867220.50954: Calling groups_plugins_play to load vars for managed_node1 13131 1726867220.52472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867220.54172: done with get_vars() 13131 1726867220.54205: done getting variables 13131 1726867220.54261: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:20:20 -0400 (0:00:00.053) 0:00:35.653 ****** 13131 1726867220.54296: entering _queue_task() for managed_node1/debug 13131 1726867220.54639: worker is 1 (out of 1 available) 13131 1726867220.54651: exiting _queue_task() for managed_node1/debug 13131 1726867220.54661: done queuing things up, now waiting for results queue to drain 13131 1726867220.54662: waiting for pending results... 13131 1726867220.54898: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13131 1726867220.54993: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000e3 13131 1726867220.55006: variable 'ansible_search_path' from source: unknown 13131 1726867220.55009: variable 'ansible_search_path' from source: unknown 13131 1726867220.55038: calling self._execute() 13131 1726867220.55114: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867220.55118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867220.55126: variable 'omit' from source: magic vars 13131 1726867220.55403: variable 'ansible_distribution_major_version' from source: facts 13131 1726867220.55411: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867220.55492: variable 'network_state' from source: role '' defaults 13131 1726867220.55500: Evaluated conditional (network_state != {}): False 13131 1726867220.55504: when evaluation is False, skipping this task 13131 1726867220.55507: _execute() done 13131 1726867220.55510: dumping result to json 13131 1726867220.55519: done 
dumping result, returning 13131 1726867220.55522: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-5f24-9b7a-0000000000e3] 13131 1726867220.55527: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000e3 13131 1726867220.55611: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000e3 13131 1726867220.55614: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "network_state != {}"
}
13131 1726867220.55656: no more pending results, returning what we have 13131 1726867220.55659: results queue empty 13131 1726867220.55660: checking for any_errors_fatal 13131 1726867220.55668: done checking for any_errors_fatal 13131 1726867220.55669: checking for max_fail_percentage 13131 1726867220.55671: done checking for max_fail_percentage 13131 1726867220.55672: checking to see if all hosts have failed and the running result is not ok 13131 1726867220.55672: done checking to see if all hosts have failed 13131 1726867220.55673: getting the remaining hosts for this loop 13131 1726867220.55674: done getting the remaining hosts for this loop 13131 1726867220.55679: getting the next task for host managed_node1 13131 1726867220.55684: done getting next task for host managed_node1 13131 1726867220.55687: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13131 1726867220.55690: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue?
False, did start at task? False 13131 1726867220.55705: getting variables 13131 1726867220.55707: in VariableManager get_vars() 13131 1726867220.55746: Calling all_inventory to load vars for managed_node1 13131 1726867220.55748: Calling groups_inventory to load vars for managed_node1 13131 1726867220.55750: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867220.55758: Calling all_plugins_play to load vars for managed_node1 13131 1726867220.55760: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867220.55762: Calling groups_plugins_play to load vars for managed_node1 13131 1726867220.56485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867220.57346: done with get_vars() 13131 1726867220.57360: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:20:20 -0400 (0:00:00.031) 0:00:35.684 ****** 13131 1726867220.57423: entering _queue_task() for managed_node1/ping 13131 1726867220.57604: worker is 1 (out of 1 available) 13131 1726867220.57616: exiting _queue_task() for managed_node1/ping 13131 1726867220.57626: done queuing things up, now waiting for results queue to drain 13131 1726867220.57628: waiting for pending results... 
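[Editor's note] The `module_args` logged above can be reproduced with role variables along these lines. This is a sketch reconstructed from the logged invocation, not the original test playbook; play/host names aside, every value (interfaces `nm-bond`, `test1`, `test2`; `miimon: 110`; `route_metric4: 65535`) is taken verbatim from the log:

```yaml
# Reconstructed sketch of the network_connections input that the
# fedora.linux_system_roles.network role translated into the logged
# module_args (provider "nm").
- hosts: managed_node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: bond0
            type: bond
            interface_name: nm-bond
            state: up
            bond:
              mode: active-backup
              miimon: 110
            ip:
              route_metric4: 65535
          - name: bond0.0
            type: ethernet
            interface_name: test1
            controller: bond0
            state: up
          - name: bond0.1
            type: ethernet
            interface_name: test2
            controller: bond0
            state: up
```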
13131 1726867220.57795: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 13131 1726867220.57880: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000000e4 13131 1726867220.57893: variable 'ansible_search_path' from source: unknown 13131 1726867220.57896: variable 'ansible_search_path' from source: unknown 13131 1726867220.57927: calling self._execute() 13131 1726867220.58000: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867220.58007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867220.58017: variable 'omit' from source: magic vars 13131 1726867220.58274: variable 'ansible_distribution_major_version' from source: facts 13131 1726867220.58285: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867220.58288: variable 'omit' from source: magic vars 13131 1726867220.58329: variable 'omit' from source: magic vars 13131 1726867220.58352: variable 'omit' from source: magic vars 13131 1726867220.58382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867220.58411: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867220.58428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867220.58440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867220.58450: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867220.58473: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867220.58476: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867220.58481: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 13131 1726867220.58549: Set connection var ansible_connection to ssh 13131 1726867220.58556: Set connection var ansible_timeout to 10 13131 1726867220.58558: Set connection var ansible_shell_type to sh 13131 1726867220.58565: Set connection var ansible_shell_executable to /bin/sh 13131 1726867220.58573: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867220.58579: Set connection var ansible_pipelining to False 13131 1726867220.58597: variable 'ansible_shell_executable' from source: unknown 13131 1726867220.58600: variable 'ansible_connection' from source: unknown 13131 1726867220.58602: variable 'ansible_module_compression' from source: unknown 13131 1726867220.58607: variable 'ansible_shell_type' from source: unknown 13131 1726867220.58609: variable 'ansible_shell_executable' from source: unknown 13131 1726867220.58612: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867220.58623: variable 'ansible_pipelining' from source: unknown 13131 1726867220.58625: variable 'ansible_timeout' from source: unknown 13131 1726867220.58627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867220.58762: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867220.58769: variable 'omit' from source: magic vars 13131 1726867220.58774: starting attempt loop 13131 1726867220.58778: running the handler 13131 1726867220.58792: _low_level_execute_command(): starting 13131 1726867220.58798: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867220.59299: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 
1726867220.59306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867220.59310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867220.59357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867220.59360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867220.59362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867220.59425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867220.61087: stdout chunk (state=3): >>>/root <<< 13131 1726867220.61186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867220.61212: stderr chunk (state=3): >>><<< 13131 1726867220.61215: stdout chunk (state=3): >>><<< 13131 1726867220.61235: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867220.61251: _low_level_execute_command(): starting 13131 1726867220.61255: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867220.6123576-14854-263763354508285 `" && echo ansible-tmp-1726867220.6123576-14854-263763354508285="` echo /root/.ansible/tmp/ansible-tmp-1726867220.6123576-14854-263763354508285 `" ) && sleep 0' 13131 1726867220.61667: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867220.61672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867220.61675: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867220.61685: stderr chunk (state=3): 
>>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867220.61688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867220.61730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867220.61737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867220.61738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867220.61782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867220.63653: stdout chunk (state=3): >>>ansible-tmp-1726867220.6123576-14854-263763354508285=/root/.ansible/tmp/ansible-tmp-1726867220.6123576-14854-263763354508285 <<< 13131 1726867220.63763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867220.63786: stderr chunk (state=3): >>><<< 13131 1726867220.63789: stdout chunk (state=3): >>><<< 13131 1726867220.63803: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867220.6123576-14854-263763354508285=/root/.ansible/tmp/ansible-tmp-1726867220.6123576-14854-263763354508285 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867220.63834: variable 'ansible_module_compression' from source: unknown 13131 1726867220.63867: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 13131 1726867220.63898: variable 'ansible_facts' from source: unknown 13131 1726867220.63952: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867220.6123576-14854-263763354508285/AnsiballZ_ping.py 13131 1726867220.64048: Sending initial data 13131 1726867220.64051: Sent initial data (153 bytes) 13131 1726867220.64444: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867220.64449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867220.64460: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867220.64508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867220.64525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867220.64573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867220.66094: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13131 1726867220.66099: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867220.66138: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867220.66185: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp0446srkv /root/.ansible/tmp/ansible-tmp-1726867220.6123576-14854-263763354508285/AnsiballZ_ping.py <<< 13131 1726867220.66192: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867220.6123576-14854-263763354508285/AnsiballZ_ping.py" <<< 13131 1726867220.66232: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp0446srkv" to remote "/root/.ansible/tmp/ansible-tmp-1726867220.6123576-14854-263763354508285/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867220.6123576-14854-263763354508285/AnsiballZ_ping.py" <<< 13131 1726867220.66743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867220.66778: stderr chunk (state=3): >>><<< 13131 1726867220.66782: stdout chunk (state=3): >>><<< 13131 1726867220.66813: done transferring module to remote 13131 1726867220.66820: _low_level_execute_command(): starting 13131 1726867220.66824: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867220.6123576-14854-263763354508285/ /root/.ansible/tmp/ansible-tmp-1726867220.6123576-14854-263763354508285/AnsiballZ_ping.py && sleep 0' 13131 1726867220.67211: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867220.67215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867220.67232: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867220.67283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867220.67286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867220.67340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867220.69064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867220.69083: stderr chunk (state=3): >>><<< 13131 1726867220.69086: stdout chunk (state=3): >>><<< 13131 1726867220.69097: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867220.69100: _low_level_execute_command(): starting 13131 1726867220.69107: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867220.6123576-14854-263763354508285/AnsiballZ_ping.py && sleep 0' 13131 1726867220.69496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867220.69499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867220.69504: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 13131 1726867220.69506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867220.69508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867220.69553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867220.69559: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 13131 1726867220.69608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867220.84466: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13131 1726867220.85824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867220.85828: stdout chunk (state=3): >>><<< 13131 1726867220.85831: stderr chunk (state=3): >>><<< 13131 1726867220.85850: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867220.85885: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867220.6123576-14854-263763354508285/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867220.85989: _low_level_execute_command(): starting 13131 1726867220.85992: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867220.6123576-14854-263763354508285/ > /dev/null 2>&1 && sleep 0' 13131 1726867220.86897: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867220.86904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
13131 1726867220.86924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867220.86947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867220.87024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867220.88898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867220.88915: stdout chunk (state=3): >>><<< 13131 1726867220.88926: stderr chunk (state=3): >>><<< 13131 1726867220.89083: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867220.89091: handler run complete 13131 1726867220.89093: attempt loop complete, returning result 13131 1726867220.89096: _execute() done 13131 
1726867220.89098: dumping result to json 13131 1726867220.89100: done dumping result, returning 13131 1726867220.89102: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-5f24-9b7a-0000000000e4] 13131 1726867220.89104: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000e4 13131 1726867220.89175: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000000e4 13131 1726867220.89180: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 13131 1726867220.89281: no more pending results, returning what we have 13131 1726867220.89285: results queue empty 13131 1726867220.89286: checking for any_errors_fatal 13131 1726867220.89293: done checking for any_errors_fatal 13131 1726867220.89294: checking for max_fail_percentage 13131 1726867220.89296: done checking for max_fail_percentage 13131 1726867220.89297: checking to see if all hosts have failed and the running result is not ok 13131 1726867220.89297: done checking to see if all hosts have failed 13131 1726867220.89298: getting the remaining hosts for this loop 13131 1726867220.89300: done getting the remaining hosts for this loop 13131 1726867220.89303: getting the next task for host managed_node1 13131 1726867220.89321: done getting next task for host managed_node1 13131 1726867220.89324: ^ task is: TASK: meta (role_complete) 13131 1726867220.89328: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 13131 1726867220.89342: getting variables 13131 1726867220.89344: in VariableManager get_vars() 13131 1726867220.89498: Calling all_inventory to load vars for managed_node1 13131 1726867220.89502: Calling groups_inventory to load vars for managed_node1 13131 1726867220.89504: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867220.89515: Calling all_plugins_play to load vars for managed_node1 13131 1726867220.89518: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867220.89521: Calling groups_plugins_play to load vars for managed_node1 13131 1726867220.91209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867220.92820: done with get_vars() 13131 1726867220.92841: done getting variables 13131 1726867220.92926: done queuing things up, now waiting for results queue to drain 13131 1726867220.92928: results queue empty 13131 1726867220.92929: checking for any_errors_fatal 13131 1726867220.92932: done checking for any_errors_fatal 13131 1726867220.92932: checking for max_fail_percentage 13131 1726867220.92933: done checking for max_fail_percentage 13131 1726867220.92934: checking to see if all hosts have failed and the running result is not ok 13131 1726867220.92935: done checking to see if all hosts have failed 13131 1726867220.92935: getting the remaining hosts for this loop 13131 1726867220.92936: done getting the remaining hosts for this loop 13131 1726867220.92939: getting the next task for host managed_node1 13131 1726867220.92944: done getting next task for host managed_node1 13131 1726867220.92947: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13131 1726867220.92949: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867220.92958: getting variables 13131 1726867220.92959: in VariableManager get_vars() 13131 1726867220.92983: Calling all_inventory to load vars for managed_node1 13131 1726867220.92985: Calling groups_inventory to load vars for managed_node1 13131 1726867220.92987: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867220.92991: Calling all_plugins_play to load vars for managed_node1 13131 1726867220.92993: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867220.92996: Calling groups_plugins_play to load vars for managed_node1 13131 1726867220.94117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867220.95795: done with get_vars() 13131 1726867220.95816: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:20:20 -0400 (0:00:00.384) 0:00:36.069 ****** 13131 1726867220.95895: entering _queue_task() for managed_node1/include_tasks 13131 1726867220.96220: worker is 1 (out of 1 available) 13131 1726867220.96231: exiting _queue_task() for managed_node1/include_tasks 13131 1726867220.96243: done queuing things up, now waiting for results queue to drain 13131 1726867220.96244: waiting for pending results... 
13131 1726867220.96697: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13131 1726867220.96705: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000011b 13131 1726867220.96709: variable 'ansible_search_path' from source: unknown 13131 1726867220.96713: variable 'ansible_search_path' from source: unknown 13131 1726867220.96737: calling self._execute() 13131 1726867220.96850: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867220.96863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867220.96880: variable 'omit' from source: magic vars 13131 1726867220.97268: variable 'ansible_distribution_major_version' from source: facts 13131 1726867220.97372: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867220.97376: _execute() done 13131 1726867220.97380: dumping result to json 13131 1726867220.97382: done dumping result, returning 13131 1726867220.97385: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-5f24-9b7a-00000000011b] 13131 1726867220.97388: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000011b 13131 1726867220.97459: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000011b 13131 1726867220.97463: WORKER PROCESS EXITING 13131 1726867220.97522: no more pending results, returning what we have 13131 1726867220.97527: in VariableManager get_vars() 13131 1726867220.97587: Calling all_inventory to load vars for managed_node1 13131 1726867220.97590: Calling groups_inventory to load vars for managed_node1 13131 1726867220.97593: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867220.97608: Calling all_plugins_play to load vars for managed_node1 13131 1726867220.97612: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867220.97615: Calling 
groups_plugins_play to load vars for managed_node1 13131 1726867220.99112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867221.00633: done with get_vars() 13131 1726867221.00653: variable 'ansible_search_path' from source: unknown 13131 1726867221.00654: variable 'ansible_search_path' from source: unknown 13131 1726867221.00693: we have included files to process 13131 1726867221.00694: generating all_blocks data 13131 1726867221.00696: done generating all_blocks data 13131 1726867221.00705: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13131 1726867221.00706: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13131 1726867221.00708: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13131 1726867221.01269: done processing included file 13131 1726867221.01272: iterating over new_blocks loaded from include file 13131 1726867221.01273: in VariableManager get_vars() 13131 1726867221.01309: done with get_vars() 13131 1726867221.01310: filtering new block on tags 13131 1726867221.01328: done filtering new block on tags 13131 1726867221.01331: in VariableManager get_vars() 13131 1726867221.01359: done with get_vars() 13131 1726867221.01361: filtering new block on tags 13131 1726867221.01383: done filtering new block on tags 13131 1726867221.01386: in VariableManager get_vars() 13131 1726867221.01418: done with get_vars() 13131 1726867221.01419: filtering new block on tags 13131 1726867221.01437: done filtering new block on tags 13131 1726867221.01439: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 13131 1726867221.01444: extending task lists for 
all hosts with included blocks 13131 1726867221.02229: done extending task lists 13131 1726867221.02231: done processing included files 13131 1726867221.02232: results queue empty 13131 1726867221.02232: checking for any_errors_fatal 13131 1726867221.02234: done checking for any_errors_fatal 13131 1726867221.02234: checking for max_fail_percentage 13131 1726867221.02235: done checking for max_fail_percentage 13131 1726867221.02236: checking to see if all hosts have failed and the running result is not ok 13131 1726867221.02237: done checking to see if all hosts have failed 13131 1726867221.02238: getting the remaining hosts for this loop 13131 1726867221.02239: done getting the remaining hosts for this loop 13131 1726867221.02241: getting the next task for host managed_node1 13131 1726867221.02245: done getting next task for host managed_node1 13131 1726867221.02247: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13131 1726867221.02250: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867221.02260: getting variables 13131 1726867221.02261: in VariableManager get_vars() 13131 1726867221.02282: Calling all_inventory to load vars for managed_node1 13131 1726867221.02284: Calling groups_inventory to load vars for managed_node1 13131 1726867221.02286: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867221.02291: Calling all_plugins_play to load vars for managed_node1 13131 1726867221.02294: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867221.02296: Calling groups_plugins_play to load vars for managed_node1 13131 1726867221.03534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867221.05057: done with get_vars() 13131 1726867221.05083: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:20:21 -0400 (0:00:00.092) 0:00:36.162 ****** 13131 1726867221.05166: entering _queue_task() for managed_node1/setup 13131 1726867221.05540: worker is 1 (out of 1 available) 13131 1726867221.05555: exiting _queue_task() for managed_node1/setup 13131 1726867221.05569: done queuing things up, now waiting for results queue to drain 13131 1726867221.05570: waiting for pending results... 
13131 1726867221.05863: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13131 1726867221.06068: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000084f 13131 1726867221.06073: variable 'ansible_search_path' from source: unknown 13131 1726867221.06075: variable 'ansible_search_path' from source: unknown 13131 1726867221.06112: calling self._execute() 13131 1726867221.06225: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867221.06286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867221.06290: variable 'omit' from source: magic vars 13131 1726867221.06658: variable 'ansible_distribution_major_version' from source: facts 13131 1726867221.06674: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867221.06885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867221.09255: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867221.09317: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867221.09368: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867221.09410: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867221.09439: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867221.09541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867221.09597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867221.09642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867221.09707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867221.09728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867221.09797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867221.09893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867221.09896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867221.09900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867221.09920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867221.10124: variable '__network_required_facts' from source: role 
'' defaults 13131 1726867221.10134: variable 'ansible_facts' from source: unknown 13131 1726867221.11161: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13131 1726867221.11171: when evaluation is False, skipping this task 13131 1726867221.11183: _execute() done 13131 1726867221.11191: dumping result to json 13131 1726867221.11198: done dumping result, returning 13131 1726867221.11210: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-5f24-9b7a-00000000084f] 13131 1726867221.11218: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000084f skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867221.11451: no more pending results, returning what we have 13131 1726867221.11455: results queue empty 13131 1726867221.11456: checking for any_errors_fatal 13131 1726867221.11457: done checking for any_errors_fatal 13131 1726867221.11458: checking for max_fail_percentage 13131 1726867221.11459: done checking for max_fail_percentage 13131 1726867221.11460: checking to see if all hosts have failed and the running result is not ok 13131 1726867221.11461: done checking to see if all hosts have failed 13131 1726867221.11462: getting the remaining hosts for this loop 13131 1726867221.11463: done getting the remaining hosts for this loop 13131 1726867221.11466: getting the next task for host managed_node1 13131 1726867221.11475: done getting next task for host managed_node1 13131 1726867221.11480: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13131 1726867221.11484: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867221.11506: getting variables 13131 1726867221.11507: in VariableManager get_vars() 13131 1726867221.11557: Calling all_inventory to load vars for managed_node1 13131 1726867221.11560: Calling groups_inventory to load vars for managed_node1 13131 1726867221.11562: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867221.11571: Calling all_plugins_play to load vars for managed_node1 13131 1726867221.11573: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867221.11575: Calling groups_plugins_play to load vars for managed_node1 13131 1726867221.12194: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000084f 13131 1726867221.12199: WORKER PROCESS EXITING 13131 1726867221.13006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867221.14590: done with get_vars() 13131 1726867221.14616: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:20:21 -0400 (0:00:00.095) 0:00:36.257 ****** 13131 1726867221.14720: entering _queue_task() for managed_node1/stat 13131 1726867221.15020: worker is 
1 (out of 1 available) 13131 1726867221.15033: exiting _queue_task() for managed_node1/stat 13131 1726867221.15044: done queuing things up, now waiting for results queue to drain 13131 1726867221.15046: waiting for pending results... 13131 1726867221.15351: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 13131 1726867221.15494: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000851 13131 1726867221.15521: variable 'ansible_search_path' from source: unknown 13131 1726867221.15556: variable 'ansible_search_path' from source: unknown 13131 1726867221.15573: calling self._execute() 13131 1726867221.15686: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867221.15700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867221.15738: variable 'omit' from source: magic vars 13131 1726867221.16106: variable 'ansible_distribution_major_version' from source: facts 13131 1726867221.16125: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867221.16301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867221.16608: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867221.16651: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867221.16717: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867221.16735: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867221.16826: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867221.16861: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867221.16933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867221.16936: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867221.17006: variable '__network_is_ostree' from source: set_fact 13131 1726867221.17016: Evaluated conditional (not __network_is_ostree is defined): False 13131 1726867221.17023: when evaluation is False, skipping this task 13131 1726867221.17031: _execute() done 13131 1726867221.17044: dumping result to json 13131 1726867221.17051: done dumping result, returning 13131 1726867221.17061: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-5f24-9b7a-000000000851] 13131 1726867221.17068: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000851 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13131 1726867221.17213: no more pending results, returning what we have 13131 1726867221.17217: results queue empty 13131 1726867221.17218: checking for any_errors_fatal 13131 1726867221.17225: done checking for any_errors_fatal 13131 1726867221.17226: checking for max_fail_percentage 13131 1726867221.17227: done checking for max_fail_percentage 13131 1726867221.17228: checking to see if all hosts have failed and the running result is not ok 13131 1726867221.17229: done checking to see if all hosts have failed 13131 1726867221.17229: getting the remaining hosts for this loop 13131 
1726867221.17231: done getting the remaining hosts for this loop 13131 1726867221.17234: getting the next task for host managed_node1 13131 1726867221.17240: done getting next task for host managed_node1 13131 1726867221.17243: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13131 1726867221.17248: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867221.17268: getting variables 13131 1726867221.17270: in VariableManager get_vars() 13131 1726867221.17420: Calling all_inventory to load vars for managed_node1 13131 1726867221.17423: Calling groups_inventory to load vars for managed_node1 13131 1726867221.17425: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867221.17434: Calling all_plugins_play to load vars for managed_node1 13131 1726867221.17436: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867221.17439: Calling groups_plugins_play to load vars for managed_node1 13131 1726867221.17995: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000851 13131 1726867221.17998: WORKER PROCESS EXITING 13131 1726867221.19149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867221.20766: done with get_vars() 13131 1726867221.20790: done getting variables 13131 1726867221.20852: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:20:21 -0400 (0:00:00.061) 0:00:36.319 ****** 13131 1726867221.20889: entering _queue_task() for managed_node1/set_fact 13131 1726867221.21218: worker is 1 (out of 1 available) 13131 1726867221.21228: exiting _queue_task() for managed_node1/set_fact 13131 1726867221.21238: done queuing things up, now waiting for results queue to drain 13131 1726867221.21239: waiting for pending results... 
13131 1726867221.21543: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13131 1726867221.21708: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000852 13131 1726867221.21729: variable 'ansible_search_path' from source: unknown 13131 1726867221.21736: variable 'ansible_search_path' from source: unknown 13131 1726867221.21775: calling self._execute() 13131 1726867221.21885: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867221.21898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867221.21917: variable 'omit' from source: magic vars 13131 1726867221.22304: variable 'ansible_distribution_major_version' from source: facts 13131 1726867221.22320: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867221.22490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867221.22793: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867221.22843: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867221.22890: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867221.22934: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867221.23034: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867221.23064: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867221.23099: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867221.23184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867221.23242: variable '__network_is_ostree' from source: set_fact 13131 1726867221.23256: Evaluated conditional (not __network_is_ostree is defined): False 13131 1726867221.23265: when evaluation is False, skipping this task 13131 1726867221.23272: _execute() done 13131 1726867221.23281: dumping result to json 13131 1726867221.23289: done dumping result, returning 13131 1726867221.23302: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-5f24-9b7a-000000000852] 13131 1726867221.23311: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000852 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13131 1726867221.23632: no more pending results, returning what we have 13131 1726867221.23636: results queue empty 13131 1726867221.23638: checking for any_errors_fatal 13131 1726867221.23645: done checking for any_errors_fatal 13131 1726867221.23647: checking for max_fail_percentage 13131 1726867221.23649: done checking for max_fail_percentage 13131 1726867221.23650: checking to see if all hosts have failed and the running result is not ok 13131 1726867221.23651: done checking to see if all hosts have failed 13131 1726867221.23651: getting the remaining hosts for this loop 13131 1726867221.23653: done getting the remaining hosts for this loop 13131 1726867221.23657: getting the next task for host managed_node1 13131 1726867221.23668: done getting next task for host managed_node1 13131 
1726867221.23672: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13131 1726867221.23676: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867221.23704: getting variables 13131 1726867221.23706: in VariableManager get_vars() 13131 1726867221.23761: Calling all_inventory to load vars for managed_node1 13131 1726867221.23764: Calling groups_inventory to load vars for managed_node1 13131 1726867221.23767: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867221.23913: Calling all_plugins_play to load vars for managed_node1 13131 1726867221.23918: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867221.23924: Calling groups_plugins_play to load vars for managed_node1 13131 1726867221.24590: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000852 13131 1726867221.24594: WORKER PROCESS EXITING 13131 1726867221.25350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867221.27072: done with get_vars() 13131 1726867221.27095: done getting variables TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:20:21 -0400 (0:00:00.063) 0:00:36.382 ****** 13131 1726867221.27201: entering _queue_task() for managed_node1/service_facts 13131 1726867221.27560: worker is 1 (out of 1 available) 13131 1726867221.27573: exiting _queue_task() for managed_node1/service_facts 13131 1726867221.27591: done queuing things up, now waiting for results queue to drain 13131 1726867221.27592: waiting for pending results... 13131 1726867221.27895: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 13131 1726867221.28082: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000854 13131 1726867221.28108: variable 'ansible_search_path' from source: unknown 13131 1726867221.28121: variable 'ansible_search_path' from source: unknown 13131 1726867221.28162: calling self._execute() 13131 1726867221.28282: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867221.28296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867221.28323: variable 'omit' from source: magic vars 13131 1726867221.29025: variable 'ansible_distribution_major_version' from source: facts 13131 1726867221.29282: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867221.29286: variable 'omit' from source: magic vars 13131 1726867221.29288: variable 'omit' from source: magic vars 13131 1726867221.29290: variable 'omit' from source: magic vars 13131 1726867221.29292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867221.29294: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867221.29296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 
1726867221.29299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867221.29301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867221.29333: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867221.29341: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867221.29348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867221.29457: Set connection var ansible_connection to ssh 13131 1726867221.29470: Set connection var ansible_timeout to 10 13131 1726867221.29476: Set connection var ansible_shell_type to sh 13131 1726867221.29490: Set connection var ansible_shell_executable to /bin/sh 13131 1726867221.29503: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867221.29513: Set connection var ansible_pipelining to False 13131 1726867221.29544: variable 'ansible_shell_executable' from source: unknown 13131 1726867221.29552: variable 'ansible_connection' from source: unknown 13131 1726867221.29558: variable 'ansible_module_compression' from source: unknown 13131 1726867221.29564: variable 'ansible_shell_type' from source: unknown 13131 1726867221.29570: variable 'ansible_shell_executable' from source: unknown 13131 1726867221.29576: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867221.29586: variable 'ansible_pipelining' from source: unknown 13131 1726867221.29592: variable 'ansible_timeout' from source: unknown 13131 1726867221.29599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867221.29798: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867221.29817: variable 'omit' from source: magic vars 13131 1726867221.29828: starting attempt loop 13131 1726867221.29834: running the handler 13131 1726867221.29858: _low_level_execute_command(): starting 13131 1726867221.29869: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867221.30596: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867221.30628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867221.30691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867221.30745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867221.30761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867221.30786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867221.30871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867221.32539: stdout chunk 
(state=3): >>>/root <<< 13131 1726867221.32675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867221.32695: stdout chunk (state=3): >>><<< 13131 1726867221.32709: stderr chunk (state=3): >>><<< 13131 1726867221.32737: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867221.32761: _low_level_execute_command(): starting 13131 1726867221.32772: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867221.3274581-14874-52309094673461 `" && echo ansible-tmp-1726867221.3274581-14874-52309094673461="` echo /root/.ansible/tmp/ansible-tmp-1726867221.3274581-14874-52309094673461 `" ) && sleep 0' 13131 1726867221.33498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867221.33568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867221.33599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867221.33603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867221.33691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867221.35565: stdout chunk (state=3): >>>ansible-tmp-1726867221.3274581-14874-52309094673461=/root/.ansible/tmp/ansible-tmp-1726867221.3274581-14874-52309094673461 <<< 13131 1726867221.35730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867221.35734: stdout chunk (state=3): >>><<< 13131 1726867221.35737: stderr chunk (state=3): >>><<< 13131 1726867221.35883: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867221.3274581-14874-52309094673461=/root/.ansible/tmp/ansible-tmp-1726867221.3274581-14874-52309094673461 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867221.35886: variable 'ansible_module_compression' from source: unknown 13131 1726867221.35889: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13131 1726867221.35911: variable 'ansible_facts' from source: unknown 13131 1726867221.36017: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867221.3274581-14874-52309094673461/AnsiballZ_service_facts.py 13131 1726867221.36244: Sending initial data 13131 1726867221.36247: Sent initial data (161 bytes) 13131 1726867221.36839: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867221.36893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867221.36910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867221.36992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867221.37019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867221.37098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867221.38665: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867221.38710: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 13131 1726867221.38773: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpnmcdm40f /root/.ansible/tmp/ansible-tmp-1726867221.3274581-14874-52309094673461/AnsiballZ_service_facts.py <<< 13131 1726867221.38785: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867221.3274581-14874-52309094673461/AnsiballZ_service_facts.py" <<< 13131 1726867221.38829: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpnmcdm40f" to remote "/root/.ansible/tmp/ansible-tmp-1726867221.3274581-14874-52309094673461/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867221.3274581-14874-52309094673461/AnsiballZ_service_facts.py" <<< 13131 1726867221.39627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867221.39801: stderr chunk (state=3): >>><<< 13131 1726867221.39804: stdout chunk (state=3): >>><<< 13131 1726867221.39807: done transferring module to remote 13131 1726867221.39809: _low_level_execute_command(): starting 13131 1726867221.39811: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867221.3274581-14874-52309094673461/ /root/.ansible/tmp/ansible-tmp-1726867221.3274581-14874-52309094673461/AnsiballZ_service_facts.py && sleep 0' 13131 1726867221.40423: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867221.40440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867221.40456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867221.40543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867221.40589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867221.40608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867221.40632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867221.40724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867221.42496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867221.42556: stderr chunk (state=3): >>><<< 13131 1726867221.42559: stdout chunk (state=3): >>><<< 13131 1726867221.42653: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867221.42657: _low_level_execute_command(): starting 13131 1726867221.42660: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867221.3274581-14874-52309094673461/AnsiballZ_service_facts.py && sleep 0' 13131 1726867221.43196: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867221.43215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867221.43234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867221.43295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867221.43361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/ac0999e354' <<< 13131 1726867221.43383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867221.43414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867221.43492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867222.96175: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": 
"dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, 
"sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 13131 1726867222.96256: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": 
{"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": 
"blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "<<< 13131 1726867222.96275: stdout chunk (state=3): >>>systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13131 1726867222.97782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867222.97786: stderr chunk (state=3): >>>Shared connection to 10.31.12.57 closed. <<< 13131 1726867222.97788: stdout chunk (state=3): >>><<< 13131 1726867222.97790: stderr chunk (state=3): >>><<< 13131 1726867222.97818: _low_level_execute_command() done: rc=0, stdout=
"systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": 
"running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": 
"systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867222.98608: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867221.3274581-14874-52309094673461/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867222.98618: _low_level_execute_command(): starting 13131 1726867222.98623: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867221.3274581-14874-52309094673461/ > /dev/null 2>&1 && sleep 0' 13131 1726867222.99265: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867222.99292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 13131 1726867222.99419: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867222.99422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867222.99473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867223.01482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867223.01485: stderr chunk (state=3): >>><<< 13131 1726867223.01488: stdout chunk (state=3): >>><<< 13131 1726867223.01490: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 13131 1726867223.01492: handler run complete 13131 1726867223.01533: variable 'ansible_facts' from source: unknown 13131 1726867223.01697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867223.02221: variable 'ansible_facts' from source: unknown 13131 1726867223.02363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867223.02589: attempt loop complete, returning result 13131 1726867223.02593: _execute() done 13131 1726867223.02597: dumping result to json 13131 1726867223.02669: done dumping result, returning 13131 1726867223.02679: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-5f24-9b7a-000000000854] 13131 1726867223.02683: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000854 13131 1726867223.03524: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000854 13131 1726867223.03527: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867223.03609: no more pending results, returning what we have 13131 1726867223.03612: results queue empty 13131 1726867223.03612: checking for any_errors_fatal 13131 1726867223.03617: done checking for any_errors_fatal 13131 1726867223.03618: checking for max_fail_percentage 13131 1726867223.03619: done checking for max_fail_percentage 13131 1726867223.03620: checking to see if all hosts have failed and the running result is not ok 13131 1726867223.03621: done checking to see if all hosts have failed 13131 1726867223.03622: getting the remaining hosts for this loop 13131 1726867223.03623: done getting the remaining hosts for this loop 13131 1726867223.03626: getting the next task for host managed_node1 13131 1726867223.03632: done getting next task for 
host managed_node1 13131 1726867223.03636: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13131 1726867223.03640: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867223.03653: getting variables 13131 1726867223.03655: in VariableManager get_vars() 13131 1726867223.03718: Calling all_inventory to load vars for managed_node1 13131 1726867223.03721: Calling groups_inventory to load vars for managed_node1 13131 1726867223.03724: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867223.03733: Calling all_plugins_play to load vars for managed_node1 13131 1726867223.03735: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867223.03738: Calling groups_plugins_play to load vars for managed_node1 13131 1726867223.05234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867223.06925: done with get_vars() 13131 1726867223.06946: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:20:23 -0400 (0:00:01.798) 0:00:38.181 ****** 13131 1726867223.07046: entering _queue_task() for managed_node1/package_facts 13131 1726867223.07355: worker is 1 (out of 1 available) 13131 1726867223.07367: exiting _queue_task() for managed_node1/package_facts 13131 1726867223.07380: done queuing things up, now waiting for results queue to drain 13131 1726867223.07381: waiting for pending results... 
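The large JSON blob earlier in this log is the return payload of the `service_facts` module: a flat mapping from systemd unit name to a record with `name`, `state`, `status`, and `source` keys. A minimal sketch of how such a payload can be filtered once captured (the helper function is hypothetical and the data is a small excerpt of the dump above, not the full result):

```python
# Excerpt of a service_facts-style payload, as seen in the log above.
# Keys are unit names; each value repeats the name and records the
# runtime state, enablement status, and fact source.
services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "kdump.service": {"name": "kdump.service", "state": "stopped",
                      "status": "enabled", "source": "systemd"},
}


def running_units(facts):
    """Return unit names whose runtime state is 'running', sorted."""
    return sorted(name for name, svc in facts.items()
                  if svc.get("state") == "running")


print(running_units(services))  # -> ['sshd.service']
```

This is the same shape the `fedora.linux_system_roles.network` role relies on in the "Check which services are running" task: it inspects `ansible_facts.services` for entries such as `NetworkManager.service` being in the `running` state before deciding how to configure the network.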
13131 1726867223.07709: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 13131 1726867223.07844: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000855 13131 1726867223.07883: variable 'ansible_search_path' from source: unknown 13131 1726867223.07887: variable 'ansible_search_path' from source: unknown 13131 1726867223.07920: calling self._execute() 13131 1726867223.08131: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867223.08136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867223.08138: variable 'omit' from source: magic vars 13131 1726867223.08434: variable 'ansible_distribution_major_version' from source: facts 13131 1726867223.08455: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867223.08468: variable 'omit' from source: magic vars 13131 1726867223.08542: variable 'omit' from source: magic vars 13131 1726867223.08595: variable 'omit' from source: magic vars 13131 1726867223.08643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867223.08695: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867223.08786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867223.08789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867223.08792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867223.08800: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867223.08811: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867223.08819: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 13131 1726867223.08932: Set connection var ansible_connection to ssh 13131 1726867223.08947: Set connection var ansible_timeout to 10 13131 1726867223.08954: Set connection var ansible_shell_type to sh 13131 1726867223.08967: Set connection var ansible_shell_executable to /bin/sh 13131 1726867223.08986: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867223.09006: Set connection var ansible_pipelining to False 13131 1726867223.09034: variable 'ansible_shell_executable' from source: unknown 13131 1726867223.09084: variable 'ansible_connection' from source: unknown 13131 1726867223.09087: variable 'ansible_module_compression' from source: unknown 13131 1726867223.09090: variable 'ansible_shell_type' from source: unknown 13131 1726867223.09092: variable 'ansible_shell_executable' from source: unknown 13131 1726867223.09094: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867223.09096: variable 'ansible_pipelining' from source: unknown 13131 1726867223.09098: variable 'ansible_timeout' from source: unknown 13131 1726867223.09100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867223.09299: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867223.09324: variable 'omit' from source: magic vars 13131 1726867223.09434: starting attempt loop 13131 1726867223.09438: running the handler 13131 1726867223.09441: _low_level_execute_command(): starting 13131 1726867223.09443: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867223.10199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867223.10242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867223.10258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867223.10286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867223.10374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867223.12016: stdout chunk (state=3): >>>/root <<< 13131 1726867223.12148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867223.12184: stdout chunk (state=3): >>><<< 13131 1726867223.12187: stderr chunk (state=3): >>><<< 13131 1726867223.12207: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867223.12300: _low_level_execute_command(): starting 13131 1726867223.12308: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867223.1221437-14937-133019906890987 `" && echo ansible-tmp-1726867223.1221437-14937-133019906890987="` echo /root/.ansible/tmp/ansible-tmp-1726867223.1221437-14937-133019906890987 `" ) && sleep 0' 13131 1726867223.12794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867223.12888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867223.12942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867223.12994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867223.14867: stdout chunk (state=3): >>>ansible-tmp-1726867223.1221437-14937-133019906890987=/root/.ansible/tmp/ansible-tmp-1726867223.1221437-14937-133019906890987 <<< 13131 1726867223.15005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867223.15008: stdout chunk (state=3): >>><<< 13131 1726867223.15010: stderr chunk (state=3): >>><<< 13131 1726867223.15182: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867223.1221437-14937-133019906890987=/root/.ansible/tmp/ansible-tmp-1726867223.1221437-14937-133019906890987 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867223.15186: variable 'ansible_module_compression' from source: unknown 13131 1726867223.15188: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13131 1726867223.15190: variable 'ansible_facts' from source: unknown 13131 1726867223.15384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867223.1221437-14937-133019906890987/AnsiballZ_package_facts.py 13131 1726867223.15538: Sending initial data 13131 1726867223.15547: Sent initial data (162 bytes) 13131 1726867223.16154: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867223.16169: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867223.16281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/ac0999e354' <<< 13131 1726867223.16308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867223.16325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867223.16406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867223.17929: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867223.17991: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867223.18063: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp_11o4rqs /root/.ansible/tmp/ansible-tmp-1726867223.1221437-14937-133019906890987/AnsiballZ_package_facts.py <<< 13131 1726867223.18079: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867223.1221437-14937-133019906890987/AnsiballZ_package_facts.py" <<< 13131 1726867223.18114: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp_11o4rqs" to remote "/root/.ansible/tmp/ansible-tmp-1726867223.1221437-14937-133019906890987/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867223.1221437-14937-133019906890987/AnsiballZ_package_facts.py" <<< 13131 1726867223.19793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867223.19796: stdout chunk (state=3): >>><<< 13131 1726867223.19798: stderr chunk (state=3): >>><<< 13131 1726867223.19800: done transferring module to remote 13131 1726867223.19802: _low_level_execute_command(): starting 13131 1726867223.19804: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867223.1221437-14937-133019906890987/ /root/.ansible/tmp/ansible-tmp-1726867223.1221437-14937-133019906890987/AnsiballZ_package_facts.py && sleep 0' 13131 1726867223.20493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867223.20509: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass <<< 13131 1726867223.20564: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867223.20614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867223.20627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867223.20650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867223.20727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867223.22512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867223.22556: stderr chunk (state=3): >>><<< 13131 1726867223.22572: stdout chunk (state=3): >>><<< 13131 1726867223.22676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867223.22681: _low_level_execute_command(): starting 13131 1726867223.22684: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867223.1221437-14937-133019906890987/AnsiballZ_package_facts.py && sleep 0' 13131 1726867223.23237: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867223.23256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867223.23348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867223.23374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867223.23450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867223.23513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' 
debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867223.23571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867223.23632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867223.67583: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": 
"nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", 
"version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 13131 1726867223.67659: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 13131 1726867223.67712: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": 
"libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": 
"1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", 
"version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", 
"version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", 
"version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": 
"libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": 
"perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", 
"version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 13131 1726867223.67728: stdout chunk (state=3): >>>", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13131 1726867223.69406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867223.69432: stderr chunk (state=3): >>><<< 13131 1726867223.69436: stdout chunk (state=3): >>><<< 13131 1726867223.69462: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
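The module result that ends here is the raw `packages` fact returned by `package_facts`: a dict keyed by package name, where each key maps to a *list* of install records (several versions of one package, e.g. kernels, can coexist). A minimal sketch of consuming that structure outside Ansible follows; the `nevra` helper is a hypothetical name, not part of Ansible, and the two sample entries are copied from the module output above.

```python
# Sketch: post-processing the `packages` dict that the package_facts module
# returns (shape taken from the JSON above). `nevra` is a hypothetical helper.

def nevra(pkg):
    """Format one package entry as name-[epoch:]version-release.arch.

    A null epoch is simply omitted here (rpm treats it as 0).
    """
    epoch = f"{pkg['epoch']}:" if pkg.get("epoch") else ""
    return f"{pkg['name']}-{epoch}{pkg['version']}-{pkg['release']}.{pkg['arch']}"

# Two entries copied from the module output; each key maps to a list because
# multiple installed versions of the same package are possible.
packages = {
    "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10",
                 "epoch": 1, "arch": "x86_64", "source": "rpm"}],
    "git": [{"name": "git", "version": "2.45.2", "release": "3.el10",
             "epoch": None, "arch": "x86_64", "source": "rpm"}],
}

for name, instances in sorted(packages.items()):
    for pkg in instances:
        print(nevra(pkg))
```

Inside a playbook the same data is reachable as `ansible_facts.packages` after the `package_facts` task runs; note that in this run the result is censored in the task output below because `no_log: true` was in effect.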
13131 1726867223.70728: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867223.1221437-14937-133019906890987/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867223.70744: _low_level_execute_command(): starting 13131 1726867223.70747: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867223.1221437-14937-133019906890987/ > /dev/null 2>&1 && sleep 0' 13131 1726867223.71179: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867223.71183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867223.71185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867223.71188: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867223.71190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867223.71249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867223.71252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867223.71295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867223.73117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867223.73137: stderr chunk (state=3): >>><<< 13131 1726867223.73141: stdout chunk (state=3): >>><<< 13131 1726867223.73152: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 13131 1726867223.73157: handler run complete 13131 1726867223.73609: variable 'ansible_facts' from source: unknown 13131 1726867223.73867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867223.75382: variable 'ansible_facts' from source: unknown 13131 1726867223.75678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867223.76434: attempt loop complete, returning result 13131 1726867223.76445: _execute() done 13131 1726867223.76449: dumping result to json 13131 1726867223.76662: done dumping result, returning 13131 1726867223.76669: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-5f24-9b7a-000000000855] 13131 1726867223.76673: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000855 13131 1726867223.78261: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000855 13131 1726867223.78264: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867223.78533: no more pending results, returning what we have 13131 1726867223.78536: results queue empty 13131 1726867223.78537: checking for any_errors_fatal 13131 1726867223.78543: done checking for any_errors_fatal 13131 1726867223.78543: checking for max_fail_percentage 13131 1726867223.78545: done checking for max_fail_percentage 13131 1726867223.78546: checking to see if all hosts have failed and the running result is not ok 13131 1726867223.78547: done checking to see if all hosts have failed 13131 1726867223.78547: getting the remaining hosts for this loop 13131 1726867223.78548: done getting the remaining hosts for this loop 13131 1726867223.78552: getting the next task for host managed_node1 13131 1726867223.78558: done 
getting next task for host managed_node1 13131 1726867223.78561: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13131 1726867223.78565: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867223.78576: getting variables 13131 1726867223.78582: in VariableManager get_vars() 13131 1726867223.78629: Calling all_inventory to load vars for managed_node1 13131 1726867223.78632: Calling groups_inventory to load vars for managed_node1 13131 1726867223.78634: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867223.78642: Calling all_plugins_play to load vars for managed_node1 13131 1726867223.78645: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867223.78647: Calling groups_plugins_play to load vars for managed_node1 13131 1726867223.81831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867223.85392: done with get_vars() 13131 1726867223.85418: done getting variables 13131 1726867223.85683: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:20:23 -0400 (0:00:00.786) 0:00:38.967 ****** 13131 1726867223.85728: entering _queue_task() for managed_node1/debug 13131 1726867223.86463: worker is 1 (out of 1 available) 13131 1726867223.86476: exiting _queue_task() for managed_node1/debug 13131 1726867223.86489: done queuing things up, now waiting for results queue to drain 13131 1726867223.86490: waiting for pending results... 13131 1726867223.86998: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 13131 1726867223.87006: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000011c 13131 1726867223.87011: variable 'ansible_search_path' from source: unknown 13131 1726867223.87015: variable 'ansible_search_path' from source: unknown 13131 1726867223.87183: calling self._execute() 13131 1726867223.87187: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867223.87190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867223.87193: variable 'omit' from source: magic vars 13131 1726867223.87509: variable 'ansible_distribution_major_version' from source: facts 13131 1726867223.87526: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867223.87537: variable 'omit' from source: magic vars 13131 1726867223.87606: variable 'omit' from source: magic vars 13131 1726867223.87714: variable 'network_provider' from source: set_fact 13131 1726867223.87740: variable 'omit' from source: magic vars 13131 1726867223.87796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867223.87835: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867223.87857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 
1726867223.87880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867223.87901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867223.87937: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867223.87945: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867223.87952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867223.88060: Set connection var ansible_connection to ssh 13131 1726867223.88074: Set connection var ansible_timeout to 10 13131 1726867223.88084: Set connection var ansible_shell_type to sh 13131 1726867223.88100: Set connection var ansible_shell_executable to /bin/sh 13131 1726867223.88120: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867223.88130: Set connection var ansible_pipelining to False 13131 1726867223.88217: variable 'ansible_shell_executable' from source: unknown 13131 1726867223.88220: variable 'ansible_connection' from source: unknown 13131 1726867223.88223: variable 'ansible_module_compression' from source: unknown 13131 1726867223.88225: variable 'ansible_shell_type' from source: unknown 13131 1726867223.88226: variable 'ansible_shell_executable' from source: unknown 13131 1726867223.88228: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867223.88230: variable 'ansible_pipelining' from source: unknown 13131 1726867223.88232: variable 'ansible_timeout' from source: unknown 13131 1726867223.88234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867223.88350: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867223.88366: variable 'omit' from source: magic vars 13131 1726867223.88375: starting attempt loop 13131 1726867223.88385: running the handler 13131 1726867223.88440: handler run complete 13131 1726867223.88459: attempt loop complete, returning result 13131 1726867223.88467: _execute() done 13131 1726867223.88544: dumping result to json 13131 1726867223.88548: done dumping result, returning 13131 1726867223.88550: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-5f24-9b7a-00000000011c] 13131 1726867223.88552: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000011c 13131 1726867223.88618: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000011c 13131 1726867223.88621: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 13131 1726867223.88716: no more pending results, returning what we have 13131 1726867223.88719: results queue empty 13131 1726867223.88720: checking for any_errors_fatal 13131 1726867223.88729: done checking for any_errors_fatal 13131 1726867223.88730: checking for max_fail_percentage 13131 1726867223.88732: done checking for max_fail_percentage 13131 1726867223.88733: checking to see if all hosts have failed and the running result is not ok 13131 1726867223.88734: done checking to see if all hosts have failed 13131 1726867223.88735: getting the remaining hosts for this loop 13131 1726867223.88736: done getting the remaining hosts for this loop 13131 1726867223.88740: getting the next task for host managed_node1 13131 1726867223.88746: done getting next task for host managed_node1 13131 1726867223.88750: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 13131 1726867223.88756: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867223.88768: getting variables 13131 1726867223.88769: in VariableManager get_vars() 13131 1726867223.88826: Calling all_inventory to load vars for managed_node1 13131 1726867223.88829: Calling groups_inventory to load vars for managed_node1 13131 1726867223.88832: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867223.88843: Calling all_plugins_play to load vars for managed_node1 13131 1726867223.88846: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867223.88850: Calling groups_plugins_play to load vars for managed_node1 13131 1726867223.90473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867223.92062: done with get_vars() 13131 1726867223.92084: done getting variables 13131 1726867223.92147: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:20:23 -0400 (0:00:00.064) 0:00:39.032 ****** 13131 1726867223.92181: entering _queue_task() for managed_node1/fail 13131 1726867223.92594: worker is 1 (out of 1 available) 13131 1726867223.92604: exiting _queue_task() for managed_node1/fail 13131 1726867223.92613: done queuing things up, now waiting for results queue to drain 13131 1726867223.92614: waiting for pending results... 13131 1726867223.92898: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13131 1726867223.92954: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000011d 13131 1726867223.92980: variable 'ansible_search_path' from source: unknown 13131 1726867223.92994: variable 'ansible_search_path' from source: unknown 13131 1726867223.93038: calling self._execute() 13131 1726867223.93143: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867223.93211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867223.93214: variable 'omit' from source: magic vars 13131 1726867223.93784: variable 'ansible_distribution_major_version' from source: facts 13131 1726867223.93788: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867223.94052: variable 'network_state' from source: role '' defaults 13131 1726867223.94068: Evaluated conditional (network_state != {}): False 13131 1726867223.94076: when evaluation is False, skipping this task 13131 1726867223.94086: _execute() done 13131 1726867223.94114: dumping result to json 13131 1726867223.94123: done dumping result, returning 13131 1726867223.94135: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [0affcac9-a3a5-5f24-9b7a-00000000011d] 13131 1726867223.94167: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000011d skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867223.94320: no more pending results, returning what we have 13131 1726867223.94324: results queue empty 13131 1726867223.94325: checking for any_errors_fatal 13131 1726867223.94332: done checking for any_errors_fatal 13131 1726867223.94332: checking for max_fail_percentage 13131 1726867223.94334: done checking for max_fail_percentage 13131 1726867223.94335: checking to see if all hosts have failed and the running result is not ok 13131 1726867223.94336: done checking to see if all hosts have failed 13131 1726867223.94337: getting the remaining hosts for this loop 13131 1726867223.94338: done getting the remaining hosts for this loop 13131 1726867223.94342: getting the next task for host managed_node1 13131 1726867223.94349: done getting next task for host managed_node1 13131 1726867223.94353: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13131 1726867223.94357: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867223.94384: getting variables 13131 1726867223.94385: in VariableManager get_vars() 13131 1726867223.94440: Calling all_inventory to load vars for managed_node1 13131 1726867223.94443: Calling groups_inventory to load vars for managed_node1 13131 1726867223.94446: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867223.94458: Calling all_plugins_play to load vars for managed_node1 13131 1726867223.94462: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867223.94465: Calling groups_plugins_play to load vars for managed_node1 13131 1726867223.95396: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000011d 13131 1726867223.95400: WORKER PROCESS EXITING 13131 1726867223.97669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867224.04731: done with get_vars() 13131 1726867224.04754: done getting variables 13131 1726867224.04803: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:20:24 -0400 (0:00:00.126) 0:00:39.158 ****** 13131 1726867224.04834: entering _queue_task() for managed_node1/fail 13131 1726867224.05487: worker is 1 (out of 1 available) 13131 1726867224.05498: exiting _queue_task() for managed_node1/fail 13131 1726867224.05508: done queuing things up, now waiting for results queue to drain 13131 1726867224.05510: waiting for pending results... 
13131 1726867224.05722: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13131 1726867224.05875: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000011e 13131 1726867224.05904: variable 'ansible_search_path' from source: unknown 13131 1726867224.05912: variable 'ansible_search_path' from source: unknown 13131 1726867224.05952: calling self._execute() 13131 1726867224.06055: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867224.06067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867224.06081: variable 'omit' from source: magic vars 13131 1726867224.06464: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.06540: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867224.06682: variable 'network_state' from source: role '' defaults 13131 1726867224.06697: Evaluated conditional (network_state != {}): False 13131 1726867224.06705: when evaluation is False, skipping this task 13131 1726867224.06712: _execute() done 13131 1726867224.06718: dumping result to json 13131 1726867224.06725: done dumping result, returning 13131 1726867224.06734: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-5f24-9b7a-00000000011e] 13131 1726867224.06743: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000011e skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867224.06914: no more pending results, returning what we have 13131 1726867224.06918: results queue empty 13131 1726867224.06919: checking for any_errors_fatal 13131 1726867224.06929: done checking for any_errors_fatal 
13131 1726867224.06930: checking for max_fail_percentage 13131 1726867224.06932: done checking for max_fail_percentage 13131 1726867224.06933: checking to see if all hosts have failed and the running result is not ok 13131 1726867224.06934: done checking to see if all hosts have failed 13131 1726867224.06934: getting the remaining hosts for this loop 13131 1726867224.06936: done getting the remaining hosts for this loop 13131 1726867224.06939: getting the next task for host managed_node1 13131 1726867224.06946: done getting next task for host managed_node1 13131 1726867224.06949: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13131 1726867224.06953: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867224.07186: getting variables 13131 1726867224.07189: in VariableManager get_vars() 13131 1726867224.07232: Calling all_inventory to load vars for managed_node1 13131 1726867224.07235: Calling groups_inventory to load vars for managed_node1 13131 1726867224.07237: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867224.07248: Calling all_plugins_play to load vars for managed_node1 13131 1726867224.07251: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867224.07254: Calling groups_plugins_play to load vars for managed_node1 13131 1726867224.07809: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000011e 13131 1726867224.07813: WORKER PROCESS EXITING 13131 1726867224.09299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867224.11369: done with get_vars() 13131 1726867224.11391: done getting variables 13131 1726867224.11450: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:20:24 -0400 (0:00:00.066) 0:00:39.225 ****** 13131 1726867224.11494: entering _queue_task() for managed_node1/fail 13131 1726867224.11883: worker is 1 (out of 1 available) 13131 1726867224.11893: exiting _queue_task() for managed_node1/fail 13131 1726867224.11903: done queuing things up, now waiting for results queue to drain 13131 1726867224.11905: waiting for pending results... 
13131 1726867224.12074: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13131 1726867224.12224: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000011f 13131 1726867224.12251: variable 'ansible_search_path' from source: unknown 13131 1726867224.12263: variable 'ansible_search_path' from source: unknown 13131 1726867224.12306: calling self._execute() 13131 1726867224.12417: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867224.12434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867224.12452: variable 'omit' from source: magic vars 13131 1726867224.12851: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.12910: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867224.13058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867224.15861: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867224.16262: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867224.16384: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867224.16389: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867224.16392: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867224.16429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.16492: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.16495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.16521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.16534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.16623: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.16637: Evaluated conditional (ansible_distribution_major_version | int > 9): True 13131 1726867224.16816: variable 'ansible_distribution' from source: facts 13131 1726867224.16819: variable '__network_rh_distros' from source: role '' defaults 13131 1726867224.16822: Evaluated conditional (ansible_distribution in __network_rh_distros): True 13131 1726867224.16999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.17028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.17082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 
1726867224.17093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.17111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.17161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.17187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.17252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.17258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.17270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.17316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.17345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13131 1726867224.17380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.17404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.17469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.17739: variable 'network_connections' from source: task vars 13131 1726867224.17749: variable 'port1_profile' from source: play vars 13131 1726867224.17817: variable 'port1_profile' from source: play vars 13131 1726867224.17827: variable 'port2_profile' from source: play vars 13131 1726867224.17923: variable 'port2_profile' from source: play vars 13131 1726867224.17927: variable 'network_state' from source: role '' defaults 13131 1726867224.17964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867224.18131: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867224.18167: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867224.18197: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867224.18245: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867224.18288: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 
1726867224.18312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867224.18344: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.18369: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867224.18395: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 13131 1726867224.18399: when evaluation is False, skipping this task 13131 1726867224.18401: _execute() done 13131 1726867224.18404: dumping result to json 13131 1726867224.18448: done dumping result, returning 13131 1726867224.18452: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-5f24-9b7a-00000000011f] 13131 1726867224.18460: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000011f 13131 1726867224.18526: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000011f 13131 1726867224.18528: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional 
result was False" } 13131 1726867224.18613: no more pending results, returning what we have 13131 1726867224.18616: results queue empty 13131 1726867224.18617: checking for any_errors_fatal 13131 1726867224.18622: done checking for any_errors_fatal 13131 1726867224.18623: checking for max_fail_percentage 13131 1726867224.18624: done checking for max_fail_percentage 13131 1726867224.18625: checking to see if all hosts have failed and the running result is not ok 13131 1726867224.18626: done checking to see if all hosts have failed 13131 1726867224.18626: getting the remaining hosts for this loop 13131 1726867224.18628: done getting the remaining hosts for this loop 13131 1726867224.18631: getting the next task for host managed_node1 13131 1726867224.18637: done getting next task for host managed_node1 13131 1726867224.18640: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13131 1726867224.18643: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867224.18663: getting variables 13131 1726867224.18664: in VariableManager get_vars() 13131 1726867224.18710: Calling all_inventory to load vars for managed_node1 13131 1726867224.18712: Calling groups_inventory to load vars for managed_node1 13131 1726867224.18714: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867224.18723: Calling all_plugins_play to load vars for managed_node1 13131 1726867224.18725: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867224.18728: Calling groups_plugins_play to load vars for managed_node1 13131 1726867224.20371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867224.22020: done with get_vars() 13131 1726867224.22042: done getting variables 13131 1726867224.22100: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:20:24 -0400 (0:00:00.106) 0:00:39.331 ****** 13131 1726867224.22141: entering _queue_task() for managed_node1/dnf 13131 1726867224.22580: worker is 1 (out of 1 available) 13131 1726867224.22591: exiting _queue_task() for managed_node1/dnf 13131 1726867224.22601: done queuing things up, now waiting for results queue to drain 13131 1726867224.22605: waiting for pending results... 
13131 1726867224.22898: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13131 1726867224.22914: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000120 13131 1726867224.22928: variable 'ansible_search_path' from source: unknown 13131 1726867224.22932: variable 'ansible_search_path' from source: unknown 13131 1726867224.22973: calling self._execute() 13131 1726867224.23075: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867224.23084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867224.23093: variable 'omit' from source: magic vars 13131 1726867224.23476: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.23536: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867224.23699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867224.26110: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867224.26179: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867224.26350: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867224.26354: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867224.26358: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867224.26404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.26443: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.26473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.26514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.26539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.26654: variable 'ansible_distribution' from source: facts 13131 1726867224.26658: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.26691: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13131 1726867224.26785: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867224.26920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.26987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.26991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.27017: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.27031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.27079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.27113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.27137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.27192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.27217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.27270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.27294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 
1726867224.27314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.27338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.27348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.27461: variable 'network_connections' from source: task vars 13131 1726867224.27473: variable 'port1_profile' from source: play vars 13131 1726867224.27519: variable 'port1_profile' from source: play vars 13131 1726867224.27528: variable 'port2_profile' from source: play vars 13131 1726867224.27569: variable 'port2_profile' from source: play vars 13131 1726867224.27624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867224.27749: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867224.27775: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867224.27800: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867224.27826: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867224.27856: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867224.27871: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867224.27894: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.27916: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867224.27953: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867224.28103: variable 'network_connections' from source: task vars 13131 1726867224.28110: variable 'port1_profile' from source: play vars 13131 1726867224.28163: variable 'port1_profile' from source: play vars 13131 1726867224.28169: variable 'port2_profile' from source: play vars 13131 1726867224.28217: variable 'port2_profile' from source: play vars 13131 1726867224.28236: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13131 1726867224.28240: when evaluation is False, skipping this task 13131 1726867224.28242: _execute() done 13131 1726867224.28246: dumping result to json 13131 1726867224.28249: done dumping result, returning 13131 1726867224.28256: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-000000000120] 13131 1726867224.28258: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000120 13131 1726867224.28342: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000120 13131 1726867224.28344: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or 
__network_team_connections_defined", "skip_reason": "Conditional result was False" } 13131 1726867224.28405: no more pending results, returning what we have 13131 1726867224.28408: results queue empty 13131 1726867224.28409: checking for any_errors_fatal 13131 1726867224.28415: done checking for any_errors_fatal 13131 1726867224.28416: checking for max_fail_percentage 13131 1726867224.28418: done checking for max_fail_percentage 13131 1726867224.28419: checking to see if all hosts have failed and the running result is not ok 13131 1726867224.28420: done checking to see if all hosts have failed 13131 1726867224.28421: getting the remaining hosts for this loop 13131 1726867224.28422: done getting the remaining hosts for this loop 13131 1726867224.28425: getting the next task for host managed_node1 13131 1726867224.28432: done getting next task for host managed_node1 13131 1726867224.28435: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13131 1726867224.28438: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867224.28458: getting variables 13131 1726867224.28460: in VariableManager get_vars() 13131 1726867224.28506: Calling all_inventory to load vars for managed_node1 13131 1726867224.28509: Calling groups_inventory to load vars for managed_node1 13131 1726867224.28511: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867224.28520: Calling all_plugins_play to load vars for managed_node1 13131 1726867224.28522: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867224.28525: Calling groups_plugins_play to load vars for managed_node1 13131 1726867224.29556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867224.30655: done with get_vars() 13131 1726867224.30670: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13131 1726867224.30724: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:20:24 -0400 (0:00:00.086) 0:00:39.418 ****** 13131 1726867224.30746: entering _queue_task() for managed_node1/yum 13131 1726867224.30958: worker is 1 (out of 1 available) 13131 1726867224.30972: exiting _queue_task() for managed_node1/yum 13131 1726867224.30985: done queuing things up, now waiting for results queue to drain 13131 1726867224.30986: waiting for pending results... 
13131 1726867224.31160: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13131 1726867224.31249: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000121 13131 1726867224.31260: variable 'ansible_search_path' from source: unknown 13131 1726867224.31263: variable 'ansible_search_path' from source: unknown 13131 1726867224.31293: calling self._execute() 13131 1726867224.31367: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867224.31370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867224.31380: variable 'omit' from source: magic vars 13131 1726867224.31648: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.31658: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867224.31889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867224.34397: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867224.34440: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867224.34500: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867224.34583: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867224.34586: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867224.34652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.34687: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.34727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.34768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.34813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.34893: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.34924: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13131 1726867224.34933: when evaluation is False, skipping this task 13131 1726867224.34940: _execute() done 13131 1726867224.35082: dumping result to json 13131 1726867224.35085: done dumping result, returning 13131 1726867224.35088: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-000000000121] 13131 1726867224.35090: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000121 13131 1726867224.35154: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000121 13131 1726867224.35157: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13131 1726867224.35212: no more pending results, returning 
what we have 13131 1726867224.35215: results queue empty 13131 1726867224.35216: checking for any_errors_fatal 13131 1726867224.35221: done checking for any_errors_fatal 13131 1726867224.35222: checking for max_fail_percentage 13131 1726867224.35224: done checking for max_fail_percentage 13131 1726867224.35224: checking to see if all hosts have failed and the running result is not ok 13131 1726867224.35225: done checking to see if all hosts have failed 13131 1726867224.35226: getting the remaining hosts for this loop 13131 1726867224.35227: done getting the remaining hosts for this loop 13131 1726867224.35230: getting the next task for host managed_node1 13131 1726867224.35237: done getting next task for host managed_node1 13131 1726867224.35241: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13131 1726867224.35244: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867224.35264: getting variables 13131 1726867224.35381: in VariableManager get_vars() 13131 1726867224.35437: Calling all_inventory to load vars for managed_node1 13131 1726867224.35440: Calling groups_inventory to load vars for managed_node1 13131 1726867224.35443: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867224.35453: Calling all_plugins_play to load vars for managed_node1 13131 1726867224.35456: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867224.35458: Calling groups_plugins_play to load vars for managed_node1 13131 1726867224.38384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867224.40168: done with get_vars() 13131 1726867224.40197: done getting variables 13131 1726867224.40260: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:20:24 -0400 (0:00:00.095) 0:00:39.513 ****** 13131 1726867224.40306: entering _queue_task() for managed_node1/fail 13131 1726867224.40696: worker is 1 (out of 1 available) 13131 1726867224.40710: exiting _queue_task() for managed_node1/fail 13131 1726867224.40723: done queuing things up, now waiting for results queue to drain 13131 1726867224.40725: waiting for pending results... 
13131 1726867224.41076: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13131 1726867224.41185: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000122 13131 1726867224.41208: variable 'ansible_search_path' from source: unknown 13131 1726867224.41234: variable 'ansible_search_path' from source: unknown 13131 1726867224.41488: calling self._execute() 13131 1726867224.41532: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867224.41545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867224.41560: variable 'omit' from source: magic vars 13131 1726867224.42374: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.42395: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867224.42566: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867224.42770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867224.44245: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867224.44298: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867224.44328: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867224.44352: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867224.44373: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867224.44446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13131 1726867224.44467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.44487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.44592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.44595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.44600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.44621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.44733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.44736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.44738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.44741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.44758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.44983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.44986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.44989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.45011: variable 'network_connections' from source: task vars 13131 1726867224.45014: variable 'port1_profile' from source: play vars 13131 1726867224.45075: variable 'port1_profile' from source: play vars 13131 1726867224.45087: variable 'port2_profile' from source: play vars 13131 1726867224.45146: variable 'port2_profile' from source: play vars 13131 1726867224.45215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867224.45380: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867224.45419: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 
1726867224.45448: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867224.45476: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867224.45521: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867224.45545: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867224.45567: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.45594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867224.45650: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867224.45834: variable 'network_connections' from source: task vars 13131 1726867224.45837: variable 'port1_profile' from source: play vars 13131 1726867224.45884: variable 'port1_profile' from source: play vars 13131 1726867224.45891: variable 'port2_profile' from source: play vars 13131 1726867224.45936: variable 'port2_profile' from source: play vars 13131 1726867224.45954: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13131 1726867224.45957: when evaluation is False, skipping this task 13131 1726867224.45960: _execute() done 13131 1726867224.45962: dumping result to json 13131 1726867224.45964: done dumping result, returning 13131 1726867224.45974: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-000000000122] 13131 1726867224.45987: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000122 13131 1726867224.46063: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000122 13131 1726867224.46066: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13131 1726867224.46125: no more pending results, returning what we have 13131 1726867224.46128: results queue empty 13131 1726867224.46129: checking for any_errors_fatal 13131 1726867224.46134: done checking for any_errors_fatal 13131 1726867224.46134: checking for max_fail_percentage 13131 1726867224.46136: done checking for max_fail_percentage 13131 1726867224.46137: checking to see if all hosts have failed and the running result is not ok 13131 1726867224.46138: done checking to see if all hosts have failed 13131 1726867224.46138: getting the remaining hosts for this loop 13131 1726867224.46139: done getting the remaining hosts for this loop 13131 1726867224.46143: getting the next task for host managed_node1 13131 1726867224.46149: done getting next task for host managed_node1 13131 1726867224.46152: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13131 1726867224.46155: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13131 1726867224.46173: getting variables 13131 1726867224.46174: in VariableManager get_vars() 13131 1726867224.46225: Calling all_inventory to load vars for managed_node1 13131 1726867224.46227: Calling groups_inventory to load vars for managed_node1 13131 1726867224.46229: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867224.46238: Calling all_plugins_play to load vars for managed_node1 13131 1726867224.46240: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867224.46242: Calling groups_plugins_play to load vars for managed_node1 13131 1726867224.47033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867224.48215: done with get_vars() 13131 1726867224.48234: done getting variables 13131 1726867224.48292: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:20:24 -0400 (0:00:00.080) 0:00:39.593 ****** 13131 1726867224.48327: entering _queue_task() for managed_node1/package 13131 1726867224.48623: worker is 1 (out of 1 available) 13131 1726867224.48636: exiting _queue_task() for managed_node1/package 13131 1726867224.48646: done queuing things up, now waiting for results queue to drain 13131 1726867224.48648: waiting for pending results... 
13131 1726867224.48994: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 13131 1726867224.49083: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000123 13131 1726867224.49116: variable 'ansible_search_path' from source: unknown 13131 1726867224.49119: variable 'ansible_search_path' from source: unknown 13131 1726867224.49147: calling self._execute() 13131 1726867224.49227: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867224.49232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867224.49243: variable 'omit' from source: magic vars 13131 1726867224.49531: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.49542: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867224.49670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867224.49860: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867224.49895: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867224.49922: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867224.49973: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867224.50048: variable 'network_packages' from source: role '' defaults 13131 1726867224.50121: variable '__network_provider_setup' from source: role '' defaults 13131 1726867224.50130: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867224.50174: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867224.50183: variable '__network_packages_default_nm' from source: role '' defaults 13131 1726867224.50229: variable 
'__network_packages_default_nm' from source: role '' defaults 13131 1726867224.50344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867224.52009: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867224.52082: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867224.52099: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867224.52169: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867224.52172: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867224.52239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.52265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.52290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.52388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.52391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 
1726867224.52394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.52406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.52428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.52467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.52483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.52694: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13131 1726867224.52775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.52793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.52811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.52844: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.52856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.52915: variable 'ansible_python' from source: facts 13131 1726867224.52939: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13131 1726867224.52991: variable '__network_wpa_supplicant_required' from source: role '' defaults 13131 1726867224.53048: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13131 1726867224.53125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.53143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.53162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.53188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.53198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.53230: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.53250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.53270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.53296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.53374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.53404: variable 'network_connections' from source: task vars 13131 1726867224.53408: variable 'port1_profile' from source: play vars 13131 1726867224.53475: variable 'port1_profile' from source: play vars 13131 1726867224.53488: variable 'port2_profile' from source: play vars 13131 1726867224.53551: variable 'port2_profile' from source: play vars 13131 1726867224.53607: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867224.53625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867224.53645: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.53666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867224.53816: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867224.53881: variable 'network_connections' from source: task vars 13131 1726867224.53884: variable 'port1_profile' from source: play vars 13131 1726867224.53957: variable 'port1_profile' from source: play vars 13131 1726867224.53964: variable 'port2_profile' from source: play vars 13131 1726867224.54032: variable 'port2_profile' from source: play vars 13131 1726867224.54059: variable '__network_packages_default_wireless' from source: role '' defaults 13131 1726867224.54114: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867224.54310: variable 'network_connections' from source: task vars 13131 1726867224.54314: variable 'port1_profile' from source: play vars 13131 1726867224.54361: variable 'port1_profile' from source: play vars 13131 1726867224.54365: variable 'port2_profile' from source: play vars 13131 1726867224.54413: variable 'port2_profile' from source: play vars 13131 1726867224.54429: variable '__network_packages_default_team' from source: role '' defaults 13131 1726867224.54540: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867224.54852: variable 'network_connections' from source: task vars 13131 1726867224.54992: variable 'port1_profile' from source: play vars 13131 1726867224.54995: variable 'port1_profile' from source: play vars 13131 1726867224.54997: variable 'port2_profile' from source: play vars 13131 1726867224.55012: variable 'port2_profile' from source: play vars 13131 1726867224.55063: variable 
'__network_service_name_default_initscripts' from source: role '' defaults 13131 1726867224.55131: variable '__network_service_name_default_initscripts' from source: role '' defaults 13131 1726867224.55143: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 1726867224.55216: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 1726867224.55525: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13131 1726867224.55840: variable 'network_connections' from source: task vars 13131 1726867224.55843: variable 'port1_profile' from source: play vars 13131 1726867224.55888: variable 'port1_profile' from source: play vars 13131 1726867224.55894: variable 'port2_profile' from source: play vars 13131 1726867224.55936: variable 'port2_profile' from source: play vars 13131 1726867224.55942: variable 'ansible_distribution' from source: facts 13131 1726867224.55945: variable '__network_rh_distros' from source: role '' defaults 13131 1726867224.55952: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.55963: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13131 1726867224.56069: variable 'ansible_distribution' from source: facts 13131 1726867224.56073: variable '__network_rh_distros' from source: role '' defaults 13131 1726867224.56076: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.56092: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13131 1726867224.56196: variable 'ansible_distribution' from source: facts 13131 1726867224.56199: variable '__network_rh_distros' from source: role '' defaults 13131 1726867224.56205: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.56231: variable 'network_provider' from source: set_fact 13131 1726867224.56242: variable 'ansible_facts' from source: unknown 
13131 1726867224.56674: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13131 1726867224.56680: when evaluation is False, skipping this task 13131 1726867224.56683: _execute() done 13131 1726867224.56686: dumping result to json 13131 1726867224.56688: done dumping result, returning 13131 1726867224.56694: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-5f24-9b7a-000000000123] 13131 1726867224.56698: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000123 13131 1726867224.56786: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000123 13131 1726867224.56788: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13131 1726867224.56841: no more pending results, returning what we have 13131 1726867224.56844: results queue empty 13131 1726867224.56845: checking for any_errors_fatal 13131 1726867224.56851: done checking for any_errors_fatal 13131 1726867224.56852: checking for max_fail_percentage 13131 1726867224.56854: done checking for max_fail_percentage 13131 1726867224.56855: checking to see if all hosts have failed and the running result is not ok 13131 1726867224.56855: done checking to see if all hosts have failed 13131 1726867224.56856: getting the remaining hosts for this loop 13131 1726867224.56857: done getting the remaining hosts for this loop 13131 1726867224.56861: getting the next task for host managed_node1 13131 1726867224.56867: done getting next task for host managed_node1 13131 1726867224.56870: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13131 1726867224.56874: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867224.56901: getting variables 13131 1726867224.56905: in VariableManager get_vars() 13131 1726867224.56956: Calling all_inventory to load vars for managed_node1 13131 1726867224.56958: Calling groups_inventory to load vars for managed_node1 13131 1726867224.56961: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867224.56970: Calling all_plugins_play to load vars for managed_node1 13131 1726867224.56972: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867224.56975: Calling groups_plugins_play to load vars for managed_node1 13131 1726867224.57785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867224.58655: done with get_vars() 13131 1726867224.58672: done getting variables 13131 1726867224.58719: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:20:24 -0400 (0:00:00.104) 0:00:39.698 ****** 13131 1726867224.58744: entering _queue_task() for managed_node1/package 13131 1726867224.58998: worker is 
1 (out of 1 available) 13131 1726867224.59010: exiting _queue_task() for managed_node1/package 13131 1726867224.59022: done queuing things up, now waiting for results queue to drain 13131 1726867224.59023: waiting for pending results... 13131 1726867224.59206: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13131 1726867224.59312: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000124 13131 1726867224.59324: variable 'ansible_search_path' from source: unknown 13131 1726867224.59327: variable 'ansible_search_path' from source: unknown 13131 1726867224.59358: calling self._execute() 13131 1726867224.59438: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867224.59442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867224.59451: variable 'omit' from source: magic vars 13131 1726867224.59738: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.59748: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867224.59838: variable 'network_state' from source: role '' defaults 13131 1726867224.59846: Evaluated conditional (network_state != {}): False 13131 1726867224.59849: when evaluation is False, skipping this task 13131 1726867224.59851: _execute() done 13131 1726867224.59854: dumping result to json 13131 1726867224.59856: done dumping result, returning 13131 1726867224.59865: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-5f24-9b7a-000000000124] 13131 1726867224.59869: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000124 13131 1726867224.59964: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000124 13131 1726867224.59966: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, 
"false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867224.60018: no more pending results, returning what we have 13131 1726867224.60022: results queue empty 13131 1726867224.60023: checking for any_errors_fatal 13131 1726867224.60029: done checking for any_errors_fatal 13131 1726867224.60030: checking for max_fail_percentage 13131 1726867224.60031: done checking for max_fail_percentage 13131 1726867224.60032: checking to see if all hosts have failed and the running result is not ok 13131 1726867224.60033: done checking to see if all hosts have failed 13131 1726867224.60034: getting the remaining hosts for this loop 13131 1726867224.60035: done getting the remaining hosts for this loop 13131 1726867224.60038: getting the next task for host managed_node1 13131 1726867224.60044: done getting next task for host managed_node1 13131 1726867224.60049: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13131 1726867224.60053: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867224.60073: getting variables 13131 1726867224.60074: in VariableManager get_vars() 13131 1726867224.60130: Calling all_inventory to load vars for managed_node1 13131 1726867224.60132: Calling groups_inventory to load vars for managed_node1 13131 1726867224.60135: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867224.60143: Calling all_plugins_play to load vars for managed_node1 13131 1726867224.60146: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867224.60148: Calling groups_plugins_play to load vars for managed_node1 13131 1726867224.61063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867224.61926: done with get_vars() 13131 1726867224.61941: done getting variables 13131 1726867224.61984: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:20:24 -0400 (0:00:00.032) 0:00:39.730 ****** 13131 1726867224.62012: entering _queue_task() for managed_node1/package 13131 1726867224.62250: worker is 1 (out of 1 available) 13131 1726867224.62263: exiting _queue_task() for managed_node1/package 13131 1726867224.62274: done queuing things up, now waiting for results queue to drain 13131 1726867224.62275: waiting for pending results... 
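
The "Install packages" skip recorded above hinges on the conditional `not network_packages is subset(ansible_facts.packages.keys())`. Jinja2's `subset` test maps to plain set containment, so the decision can be sketched in Python with hypothetical values (the real `network_packages` and package facts come from the role and from `package_facts` on the managed node):

```python
# Hypothetical example values -- the log does not show the actual lists.
network_packages = ["NetworkManager"]                # packages the role wants
installed = {"NetworkManager": [], "kernel": []}     # shape of ansible_facts.packages

# Jinja2's `x is subset(y)` is equivalent to set(x) <= set(y):
need_install = not set(network_packages).issubset(installed.keys())
# need_install is False here, which is why the task is skipped in the log.
```

When every required package already appears in the gathered package facts, the conditional is False and Ansible reports `skip_reason: "Conditional result was False"`, exactly as logged.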
13131 1726867224.62456: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13131 1726867224.62550: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000125 13131 1726867224.62562: variable 'ansible_search_path' from source: unknown 13131 1726867224.62565: variable 'ansible_search_path' from source: unknown 13131 1726867224.62595: calling self._execute() 13131 1726867224.62675: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867224.62680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867224.62691: variable 'omit' from source: magic vars 13131 1726867224.62957: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.62967: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867224.63051: variable 'network_state' from source: role '' defaults 13131 1726867224.63055: Evaluated conditional (network_state != {}): False 13131 1726867224.63058: when evaluation is False, skipping this task 13131 1726867224.63061: _execute() done 13131 1726867224.63063: dumping result to json 13131 1726867224.63068: done dumping result, returning 13131 1726867224.63075: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-5f24-9b7a-000000000125] 13131 1726867224.63082: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000125 13131 1726867224.63170: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000125 13131 1726867224.63173: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867224.63223: no more pending results, returning what we have 13131 1726867224.63226: results queue empty 13131 1726867224.63227: checking for 
any_errors_fatal 13131 1726867224.63233: done checking for any_errors_fatal 13131 1726867224.63234: checking for max_fail_percentage 13131 1726867224.63236: done checking for max_fail_percentage 13131 1726867224.63237: checking to see if all hosts have failed and the running result is not ok 13131 1726867224.63237: done checking to see if all hosts have failed 13131 1726867224.63238: getting the remaining hosts for this loop 13131 1726867224.63239: done getting the remaining hosts for this loop 13131 1726867224.63242: getting the next task for host managed_node1 13131 1726867224.63249: done getting next task for host managed_node1 13131 1726867224.63252: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13131 1726867224.63255: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867224.63273: getting variables 13131 1726867224.63274: in VariableManager get_vars() 13131 1726867224.63324: Calling all_inventory to load vars for managed_node1 13131 1726867224.63327: Calling groups_inventory to load vars for managed_node1 13131 1726867224.63329: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867224.63337: Calling all_plugins_play to load vars for managed_node1 13131 1726867224.63340: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867224.63342: Calling groups_plugins_play to load vars for managed_node1 13131 1726867224.64094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867224.64965: done with get_vars() 13131 1726867224.64981: done getting variables 13131 1726867224.65025: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:20:24 -0400 (0:00:00.030) 0:00:39.761 ****** 13131 1726867224.65049: entering _queue_task() for managed_node1/service 13131 1726867224.65256: worker is 1 (out of 1 available) 13131 1726867224.65268: exiting _queue_task() for managed_node1/service 13131 1726867224.65280: done queuing things up, now waiting for results queue to drain 13131 1726867224.65281: waiting for pending results... 
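
Both nmstate-related install tasks above skip on the same test, `network_state != {}`. The role's default for `network_state` is an empty dict, so unless the caller passes a non-empty `network_state`, the comparison is False; a minimal sketch, assuming the default:

```python
# Role default when the playbook does not set network_state.
network_state = {}

# The task runs only if the user supplied a non-empty network_state mapping.
run_task = network_state != {}
# run_task is False, matching the two skipped tasks in the log.
```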
13131 1726867224.65452: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13131 1726867224.65540: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000126 13131 1726867224.65553: variable 'ansible_search_path' from source: unknown 13131 1726867224.65557: variable 'ansible_search_path' from source: unknown 13131 1726867224.65586: calling self._execute() 13131 1726867224.65662: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867224.65666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867224.65676: variable 'omit' from source: magic vars 13131 1726867224.65938: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.65954: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867224.66039: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867224.66174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867224.67868: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867224.67914: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867224.67950: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867224.67975: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867224.67996: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867224.68057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13131 1726867224.68076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.68095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.68127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.68138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.68169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.68187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.68203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.68232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.68244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.68271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.68288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.68306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.68337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.68343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.68515: variable 'network_connections' from source: task vars 13131 1726867224.68519: variable 'port1_profile' from source: play vars 13131 1726867224.68583: variable 'port1_profile' from source: play vars 13131 1726867224.68593: variable 'port2_profile' from source: play vars 13131 1726867224.68789: variable 'port2_profile' from source: play vars 13131 1726867224.68793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867224.68903: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867224.68947: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 
1726867224.68986: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867224.69041: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867224.69093: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867224.69133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867224.69166: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.69201: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867224.69268: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867224.69533: variable 'network_connections' from source: task vars 13131 1726867224.69554: variable 'port1_profile' from source: play vars 13131 1726867224.69621: variable 'port1_profile' from source: play vars 13131 1726867224.69636: variable 'port2_profile' from source: play vars 13131 1726867224.69712: variable 'port2_profile' from source: play vars 13131 1726867224.69742: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13131 1726867224.69752: when evaluation is False, skipping this task 13131 1726867224.69768: _execute() done 13131 1726867224.69783: dumping result to json 13131 1726867224.69791: done dumping result, returning 13131 1726867224.69804: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-000000000126] 13131 1726867224.69882: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000126 13131 1726867224.69952: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000126 13131 1726867224.69956: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13131 1726867224.70016: no more pending results, returning what we have 13131 1726867224.70020: results queue empty 13131 1726867224.70021: checking for any_errors_fatal 13131 1726867224.70027: done checking for any_errors_fatal 13131 1726867224.70028: checking for max_fail_percentage 13131 1726867224.70030: done checking for max_fail_percentage 13131 1726867224.70031: checking to see if all hosts have failed and the running result is not ok 13131 1726867224.70032: done checking to see if all hosts have failed 13131 1726867224.70032: getting the remaining hosts for this loop 13131 1726867224.70034: done getting the remaining hosts for this loop 13131 1726867224.70037: getting the next task for host managed_node1 13131 1726867224.70045: done getting next task for host managed_node1 13131 1726867224.70049: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13131 1726867224.70052: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13131 1726867224.70071: getting variables 13131 1726867224.70073: in VariableManager get_vars() 13131 1726867224.70253: Calling all_inventory to load vars for managed_node1 13131 1726867224.70257: Calling groups_inventory to load vars for managed_node1 13131 1726867224.70259: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867224.70270: Calling all_plugins_play to load vars for managed_node1 13131 1726867224.70274: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867224.70422: Calling groups_plugins_play to load vars for managed_node1 13131 1726867224.72434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867224.74008: done with get_vars() 13131 1726867224.74023: done getting variables 13131 1726867224.74064: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:20:24 -0400 (0:00:00.090) 0:00:39.851 ****** 13131 1726867224.74089: entering _queue_task() for managed_node1/service 13131 1726867224.74317: worker is 1 (out of 1 available) 13131 1726867224.74330: exiting _queue_task() for managed_node1/service 13131 1726867224.74341: done queuing things up, now waiting for results queue to drain 13131 1726867224.74343: waiting for pending results... 
13131 1726867224.74530: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13131 1726867224.74623: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000127 13131 1726867224.74635: variable 'ansible_search_path' from source: unknown 13131 1726867224.74638: variable 'ansible_search_path' from source: unknown 13131 1726867224.74667: calling self._execute() 13131 1726867224.74746: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867224.74750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867224.74761: variable 'omit' from source: magic vars 13131 1726867224.75036: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.75045: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867224.75152: variable 'network_provider' from source: set_fact 13131 1726867224.75156: variable 'network_state' from source: role '' defaults 13131 1726867224.75164: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13131 1726867224.75171: variable 'omit' from source: magic vars 13131 1726867224.75213: variable 'omit' from source: magic vars 13131 1726867224.75237: variable 'network_service_name' from source: role '' defaults 13131 1726867224.75283: variable 'network_service_name' from source: role '' defaults 13131 1726867224.75353: variable '__network_provider_setup' from source: role '' defaults 13131 1726867224.75357: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867224.75405: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867224.75411: variable '__network_packages_default_nm' from source: role '' defaults 13131 1726867224.75455: variable '__network_packages_default_nm' from source: role '' defaults 13131 1726867224.75806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13131 1726867224.78459: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867224.78534: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867224.78573: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867224.78613: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867224.78641: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867224.78719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.78752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.78783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.78826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.78844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.78892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13131 1726867224.78920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.78947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.78989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.79182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.79230: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13131 1726867224.79339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.79367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.79397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.79438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.79456: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.79614: variable 'ansible_python' from source: facts 13131 1726867224.79638: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13131 1726867224.79716: variable '__network_wpa_supplicant_required' from source: role '' defaults 13131 1726867224.79795: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13131 1726867224.79919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.79948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.79976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.80019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.80035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.80076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867224.80115: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867224.80143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.80189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867224.80208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867224.80342: variable 'network_connections' from source: task vars 13131 1726867224.80355: variable 'port1_profile' from source: play vars 13131 1726867224.80432: variable 'port1_profile' from source: play vars 13131 1726867224.80451: variable 'port2_profile' from source: play vars 13131 1726867224.80520: variable 'port2_profile' from source: play vars 13131 1726867224.80615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867224.80792: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867224.80842: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867224.80890: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867224.80934: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867224.80992: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867224.81022: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867224.81184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867224.81187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867224.81189: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867224.81385: variable 'network_connections' from source: task vars 13131 1726867224.81396: variable 'port1_profile' from source: play vars 13131 1726867224.81467: variable 'port1_profile' from source: play vars 13131 1726867224.81782: variable 'port2_profile' from source: play vars 13131 1726867224.81786: variable 'port2_profile' from source: play vars 13131 1726867224.81801: variable '__network_packages_default_wireless' from source: role '' defaults 13131 1726867224.81881: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867224.82401: variable 'network_connections' from source: task vars 13131 1726867224.82405: variable 'port1_profile' from source: play vars 13131 1726867224.82630: variable 'port1_profile' from source: play vars 13131 1726867224.83082: variable 'port2_profile' from source: play vars 13131 1726867224.83085: variable 'port2_profile' from source: play vars 13131 1726867224.83088: variable '__network_packages_default_team' from source: role '' defaults 13131 1726867224.83090: variable '__network_team_connections_defined' from source: role '' defaults 
13131 1726867224.83547: variable 'network_connections' from source: task vars 13131 1726867224.83621: variable 'port1_profile' from source: play vars 13131 1726867224.83715: variable 'port1_profile' from source: play vars 13131 1726867224.83893: variable 'port2_profile' from source: play vars 13131 1726867224.84041: variable 'port2_profile' from source: play vars 13131 1726867224.84107: variable '__network_service_name_default_initscripts' from source: role '' defaults 13131 1726867224.84340: variable '__network_service_name_default_initscripts' from source: role '' defaults 13131 1726867224.84351: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 1726867224.84413: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 1726867224.84656: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13131 1726867224.85152: variable 'network_connections' from source: task vars 13131 1726867224.85162: variable 'port1_profile' from source: play vars 13131 1726867224.85224: variable 'port1_profile' from source: play vars 13131 1726867224.85237: variable 'port2_profile' from source: play vars 13131 1726867224.85302: variable 'port2_profile' from source: play vars 13131 1726867224.85315: variable 'ansible_distribution' from source: facts 13131 1726867224.85322: variable '__network_rh_distros' from source: role '' defaults 13131 1726867224.85332: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.85349: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13131 1726867224.85508: variable 'ansible_distribution' from source: facts 13131 1726867224.85517: variable '__network_rh_distros' from source: role '' defaults 13131 1726867224.85526: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.85541: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' 
defaults 13131 1726867224.85702: variable 'ansible_distribution' from source: facts 13131 1726867224.85712: variable '__network_rh_distros' from source: role '' defaults 13131 1726867224.85720: variable 'ansible_distribution_major_version' from source: facts 13131 1726867224.85754: variable 'network_provider' from source: set_fact 13131 1726867224.85780: variable 'omit' from source: magic vars 13131 1726867224.85812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867224.85842: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867224.86282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867224.86286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867224.86289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867224.86291: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867224.86293: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867224.86295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867224.86297: Set connection var ansible_connection to ssh 13131 1726867224.86299: Set connection var ansible_timeout to 10 13131 1726867224.86300: Set connection var ansible_shell_type to sh 13131 1726867224.86302: Set connection var ansible_shell_executable to /bin/sh 13131 1726867224.86307: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867224.86319: Set connection var ansible_pipelining to False 13131 1726867224.86349: variable 'ansible_shell_executable' from source: unknown 13131 1726867224.86682: variable 'ansible_connection' from source: unknown 13131 1726867224.86686: variable 
'ansible_module_compression' from source: unknown 13131 1726867224.86688: variable 'ansible_shell_type' from source: unknown 13131 1726867224.86697: variable 'ansible_shell_executable' from source: unknown 13131 1726867224.86700: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867224.86705: variable 'ansible_pipelining' from source: unknown 13131 1726867224.86707: variable 'ansible_timeout' from source: unknown 13131 1726867224.86710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867224.86713: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867224.86715: variable 'omit' from source: magic vars 13131 1726867224.86717: starting attempt loop 13131 1726867224.86719: running the handler 13131 1726867224.86780: variable 'ansible_facts' from source: unknown 13131 1726867224.88240: _low_level_execute_command(): starting 13131 1726867224.88475: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867224.89930: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867224.89945: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867224.90111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867224.90123: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867224.90274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867224.91972: stdout chunk (state=3): >>>/root <<< 13131 1726867224.92105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867224.92118: stdout chunk (state=3): >>><<< 13131 1726867224.92131: stderr chunk (state=3): >>><<< 13131 1726867224.92154: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867224.92174: _low_level_execute_command(): starting 13131 1726867224.92188: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867224.9216027-15018-75720344415946 `" && echo ansible-tmp-1726867224.9216027-15018-75720344415946="` echo /root/.ansible/tmp/ansible-tmp-1726867224.9216027-15018-75720344415946 `" ) && sleep 0' 13131 1726867224.92967: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867224.92984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867224.92998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867224.93092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867224.93208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867224.93305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 13131 1726867224.93373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867224.95245: stdout chunk (state=3): >>>ansible-tmp-1726867224.9216027-15018-75720344415946=/root/.ansible/tmp/ansible-tmp-1726867224.9216027-15018-75720344415946 <<< 13131 1726867224.95429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867224.95465: stdout chunk (state=3): >>><<< 13131 1726867224.95496: stderr chunk (state=3): >>><<< 13131 1726867224.95520: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867224.9216027-15018-75720344415946=/root/.ansible/tmp/ansible-tmp-1726867224.9216027-15018-75720344415946 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867224.95572: variable 'ansible_module_compression' from source: unknown 13131 1726867224.95641: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13131 1726867224.95764: variable 'ansible_facts' from source: unknown 13131 1726867224.96022: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867224.9216027-15018-75720344415946/AnsiballZ_systemd.py 13131 1726867224.96617: Sending initial data 13131 1726867224.96620: Sent initial data (155 bytes) 13131 1726867224.97640: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867224.97655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 13131 1726867224.97667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867224.97722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867224.97733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867224.97922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867224.98011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867224.99548: stderr chunk 
(state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867224.99794: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13131 1726867225.00040: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpj0p4rqge /root/.ansible/tmp/ansible-tmp-1726867224.9216027-15018-75720344415946/AnsiballZ_systemd.py <<< 13131 1726867225.00060: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867224.9216027-15018-75720344415946/AnsiballZ_systemd.py" <<< 13131 1726867225.00108: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpj0p4rqge" to remote "/root/.ansible/tmp/ansible-tmp-1726867224.9216027-15018-75720344415946/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867224.9216027-15018-75720344415946/AnsiballZ_systemd.py" <<< 13131 1726867225.02944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867225.02971: stderr chunk (state=3): >>><<< 13131 1726867225.02995: stdout chunk (state=3): >>><<< 13131 1726867225.03051: done transferring module to remote 13131 1726867225.03066: 
_low_level_execute_command(): starting 13131 1726867225.03074: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867224.9216027-15018-75720344415946/ /root/.ansible/tmp/ansible-tmp-1726867224.9216027-15018-75720344415946/AnsiballZ_systemd.py && sleep 0' 13131 1726867225.03697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867225.03760: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867225.03828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867225.03843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867225.03871: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867225.03950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867225.05837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867225.05862: stderr chunk (state=3): >>><<< 13131 1726867225.05866: stdout chunk (state=3): >>><<< 13131 1726867225.05893: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867225.05971: _low_level_execute_command(): starting 13131 1726867225.05975: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867224.9216027-15018-75720344415946/AnsiballZ_systemd.py && sleep 0' 13131 1726867225.06550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867225.06553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867225.06556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867225.06558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867225.06560: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867225.06563: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867225.06565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867225.06567: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867225.06569: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867225.06576: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867225.06591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867225.06593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867225.06641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867225.06645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867225.06647: stderr chunk (state=3): >>>debug2: match found <<< 13131 1726867225.06649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867225.06700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867225.06723: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867225.06749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867225.06804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867225.35807: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", 
"RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": "14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; 
code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10846208", "MemoryPeak": "14745600", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3291508736", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "977001000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 13131 1726867225.35828: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": 
"no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target 
NetworkManager-wait-online.service network.target", "After": "dbus-br<<< 13131 1726867225.35837: stdout chunk (state=3): >>>oker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", "AssertTimestampMonotonic": "14925366", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": 
"started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13131 1726867225.37884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867225.37887: stdout chunk (state=3): >>><<< 13131 1726867225.37890: stderr chunk (state=3): >>><<< 13131 1726867225.37898: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": "14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; 
ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10846208", "MemoryPeak": "14745600", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3291508736", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "977001000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": 
"0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", 
"TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target NetworkManager-wait-online.service network.target", "After": "dbus-broker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": 
"fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", "AssertTimestampMonotonic": "14925366", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867225.37934: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867224.9216027-15018-75720344415946/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867225.37953: _low_level_execute_command(): starting 13131 1726867225.37958: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867224.9216027-15018-75720344415946/ > /dev/null 2>&1 && sleep 0' 13131 1726867225.38564: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867225.38576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867225.38683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867225.38727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867225.38764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867225.40600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867225.40603: stdout chunk (state=3): >>><<< 13131 1726867225.40613: stderr chunk (state=3): >>><<< 13131 1726867225.40653: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867225.40659: handler run complete 13131 1726867225.40796: attempt loop complete, returning result 13131 1726867225.40799: _execute() done 13131 1726867225.40802: dumping result to json 13131 1726867225.40823: done dumping result, returning 13131 1726867225.40832: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-5f24-9b7a-000000000127] 13131 1726867225.40835: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000127 13131 1726867225.41483: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000127 13131 1726867225.41487: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867225.41672: no more pending results, returning what we have 13131 1726867225.41675: results queue empty 13131 1726867225.41676: checking for any_errors_fatal 13131 1726867225.41683: done checking for any_errors_fatal 13131 1726867225.41683: checking for max_fail_percentage 13131 1726867225.41686: done checking for max_fail_percentage 13131 1726867225.41686: checking to see if all hosts have failed and the running result is not ok 13131 1726867225.41687: done checking to see if all hosts have failed 13131 1726867225.41688: getting the remaining hosts for this loop 13131 1726867225.41689: done getting the remaining hosts for this loop 13131 1726867225.41692: getting the next task for host managed_node1 13131 1726867225.41697: done getting next task for host managed_node1 13131 1726867225.41700: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13131 1726867225.41706: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867225.41717: getting variables 13131 1726867225.41718: in VariableManager get_vars() 13131 1726867225.41758: Calling all_inventory to load vars for managed_node1 13131 1726867225.41761: Calling groups_inventory to load vars for managed_node1 13131 1726867225.41763: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867225.41772: Calling all_plugins_play to load vars for managed_node1 13131 1726867225.41774: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867225.41781: Calling groups_plugins_play to load vars for managed_node1 13131 1726867225.44019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867225.45919: done with get_vars() 13131 1726867225.45941: done getting variables 13131 1726867225.46010: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:20:25 -0400 (0:00:00.719) 0:00:40.571 ****** 13131 1726867225.46043: entering _queue_task() for managed_node1/service 13131 1726867225.46359: worker is 1 (out of 1 available) 13131 1726867225.46370: exiting 
_queue_task() for managed_node1/service 13131 1726867225.46383: done queuing things up, now waiting for results queue to drain 13131 1726867225.46385: waiting for pending results... 13131 1726867225.46681: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13131 1726867225.46815: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000128 13131 1726867225.46829: variable 'ansible_search_path' from source: unknown 13131 1726867225.46832: variable 'ansible_search_path' from source: unknown 13131 1726867225.46874: calling self._execute() 13131 1726867225.46969: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867225.46994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867225.47082: variable 'omit' from source: magic vars 13131 1726867225.47390: variable 'ansible_distribution_major_version' from source: facts 13131 1726867225.47401: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867225.47529: variable 'network_provider' from source: set_fact 13131 1726867225.47535: Evaluated conditional (network_provider == "nm"): True 13131 1726867225.47633: variable '__network_wpa_supplicant_required' from source: role '' defaults 13131 1726867225.47723: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13131 1726867225.47902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867225.50075: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867225.50146: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867225.50182: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867225.50217: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867225.50248: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867225.50482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867225.50486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867225.50489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867225.50492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867225.50494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867225.50496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867225.50519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867225.50541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 13131 1726867225.50579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867225.50599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867225.50640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867225.50661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867225.50685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867225.50731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867225.50743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867225.50886: variable 'network_connections' from source: task vars 13131 1726867225.50897: variable 'port1_profile' from source: play vars 13131 1726867225.50965: variable 'port1_profile' from source: play vars 13131 1726867225.50976: variable 'port2_profile' from source: play vars 13131 1726867225.51045: variable 'port2_profile' from source: play vars 13131 1726867225.51114: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867225.51290: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867225.51329: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867225.51365: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867225.51395: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867225.51436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867225.51582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867225.51585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867225.51588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867225.51590: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867225.51839: variable 'network_connections' from source: task vars 13131 1726867225.51842: variable 'port1_profile' from source: play vars 13131 1726867225.51913: variable 'port1_profile' from source: play vars 13131 1726867225.51921: variable 'port2_profile' from source: play vars 13131 1726867225.51981: variable 'port2_profile' from source: play vars 13131 1726867225.52018: Evaluated conditional 
(__network_wpa_supplicant_required): False 13131 1726867225.52021: when evaluation is False, skipping this task 13131 1726867225.52031: _execute() done 13131 1726867225.52034: dumping result to json 13131 1726867225.52037: done dumping result, returning 13131 1726867225.52039: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-5f24-9b7a-000000000128] 13131 1726867225.52041: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000128 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13131 1726867225.52174: no more pending results, returning what we have 13131 1726867225.52179: results queue empty 13131 1726867225.52180: checking for any_errors_fatal 13131 1726867225.52201: done checking for any_errors_fatal 13131 1726867225.52205: checking for max_fail_percentage 13131 1726867225.52207: done checking for max_fail_percentage 13131 1726867225.52208: checking to see if all hosts have failed and the running result is not ok 13131 1726867225.52208: done checking to see if all hosts have failed 13131 1726867225.52209: getting the remaining hosts for this loop 13131 1726867225.52210: done getting the remaining hosts for this loop 13131 1726867225.52218: getting the next task for host managed_node1 13131 1726867225.52225: done getting next task for host managed_node1 13131 1726867225.52228: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13131 1726867225.52232: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867225.52252: getting variables 13131 1726867225.52253: in VariableManager get_vars() 13131 1726867225.52310: Calling all_inventory to load vars for managed_node1 13131 1726867225.52313: Calling groups_inventory to load vars for managed_node1 13131 1726867225.52316: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867225.52498: Calling all_plugins_play to load vars for managed_node1 13131 1726867225.52505: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867225.52510: Calling groups_plugins_play to load vars for managed_node1 13131 1726867225.53032: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000128 13131 1726867225.53035: WORKER PROCESS EXITING 13131 1726867225.53905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867225.55471: done with get_vars() 13131 1726867225.55505: done getting variables 13131 1726867225.55563: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:20:25 -0400 (0:00:00.095) 0:00:40.666 ****** 13131 1726867225.55609: entering _queue_task() for managed_node1/service 13131 1726867225.55940: worker is 1 (out of 1 available) 13131 1726867225.55953: exiting _queue_task() for managed_node1/service 13131 1726867225.55963: done queuing things up, now waiting for 
results queue to drain 13131 1726867225.55965: waiting for pending results... 13131 1726867225.56294: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 13131 1726867225.56343: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000129 13131 1726867225.56365: variable 'ansible_search_path' from source: unknown 13131 1726867225.56375: variable 'ansible_search_path' from source: unknown 13131 1726867225.56420: calling self._execute() 13131 1726867225.56520: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867225.56534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867225.56550: variable 'omit' from source: magic vars 13131 1726867225.56918: variable 'ansible_distribution_major_version' from source: facts 13131 1726867225.56935: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867225.57046: variable 'network_provider' from source: set_fact 13131 1726867225.57058: Evaluated conditional (network_provider == "initscripts"): False 13131 1726867225.57066: when evaluation is False, skipping this task 13131 1726867225.57073: _execute() done 13131 1726867225.57083: dumping result to json 13131 1726867225.57091: done dumping result, returning 13131 1726867225.57284: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-5f24-9b7a-000000000129] 13131 1726867225.57288: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000129 13131 1726867225.57353: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000129 13131 1726867225.57357: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867225.57395: no more pending results, returning what we have 13131 1726867225.57398: results queue empty 13131 
1726867225.57399: checking for any_errors_fatal 13131 1726867225.57407: done checking for any_errors_fatal 13131 1726867225.57408: checking for max_fail_percentage 13131 1726867225.57409: done checking for max_fail_percentage 13131 1726867225.57410: checking to see if all hosts have failed and the running result is not ok 13131 1726867225.57411: done checking to see if all hosts have failed 13131 1726867225.57411: getting the remaining hosts for this loop 13131 1726867225.57412: done getting the remaining hosts for this loop 13131 1726867225.57415: getting the next task for host managed_node1 13131 1726867225.57420: done getting next task for host managed_node1 13131 1726867225.57423: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13131 1726867225.57426: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867225.57442: getting variables 13131 1726867225.57443: in VariableManager get_vars() 13131 1726867225.57513: Calling all_inventory to load vars for managed_node1 13131 1726867225.57516: Calling groups_inventory to load vars for managed_node1 13131 1726867225.57518: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867225.57527: Calling all_plugins_play to load vars for managed_node1 13131 1726867225.57530: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867225.57533: Calling groups_plugins_play to load vars for managed_node1 13131 1726867225.58803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867225.59654: done with get_vars() 13131 1726867225.59668: done getting variables 13131 1726867225.59708: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:20:25 -0400 (0:00:00.041) 0:00:40.707 ****** 13131 1726867225.59730: entering _queue_task() for managed_node1/copy 13131 1726867225.59925: worker is 1 (out of 1 available) 13131 1726867225.59939: exiting _queue_task() for managed_node1/copy 13131 1726867225.59949: done queuing things up, now waiting for results queue to drain 13131 1726867225.59950: waiting for pending results... 
13131 1726867225.60131: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13131 1726867225.60250: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000012a 13131 1726867225.60272: variable 'ansible_search_path' from source: unknown 13131 1726867225.60279: variable 'ansible_search_path' from source: unknown 13131 1726867225.60323: calling self._execute() 13131 1726867225.60489: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867225.60492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867225.60495: variable 'omit' from source: magic vars 13131 1726867225.60828: variable 'ansible_distribution_major_version' from source: facts 13131 1726867225.60847: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867225.60968: variable 'network_provider' from source: set_fact 13131 1726867225.60986: Evaluated conditional (network_provider == "initscripts"): False 13131 1726867225.60995: when evaluation is False, skipping this task 13131 1726867225.61001: _execute() done 13131 1726867225.61008: dumping result to json 13131 1726867225.61015: done dumping result, returning 13131 1726867225.61035: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-5f24-9b7a-00000000012a] 13131 1726867225.61046: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000012a skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13131 1726867225.61291: no more pending results, returning what we have 13131 1726867225.61295: results queue empty 13131 1726867225.61296: checking for any_errors_fatal 13131 1726867225.61303: done checking for any_errors_fatal 13131 1726867225.61304: checking for max_fail_percentage 13131 
1726867225.61306: done checking for max_fail_percentage 13131 1726867225.61307: checking to see if all hosts have failed and the running result is not ok 13131 1726867225.61307: done checking to see if all hosts have failed 13131 1726867225.61308: getting the remaining hosts for this loop 13131 1726867225.61309: done getting the remaining hosts for this loop 13131 1726867225.61313: getting the next task for host managed_node1 13131 1726867225.61319: done getting next task for host managed_node1 13131 1726867225.61323: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13131 1726867225.61327: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867225.61348: getting variables 13131 1726867225.61350: in VariableManager get_vars() 13131 1726867225.61409: Calling all_inventory to load vars for managed_node1 13131 1726867225.61412: Calling groups_inventory to load vars for managed_node1 13131 1726867225.61414: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867225.61424: Calling all_plugins_play to load vars for managed_node1 13131 1726867225.61427: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867225.61430: Calling groups_plugins_play to load vars for managed_node1 13131 1726867225.61990: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000012a 13131 1726867225.61993: WORKER PROCESS EXITING 13131 1726867225.62680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867225.63599: done with get_vars() 13131 1726867225.63621: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:20:25 -0400 (0:00:00.039) 0:00:40.747 ****** 13131 1726867225.63705: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 13131 1726867225.63956: worker is 1 (out of 1 available) 13131 1726867225.63970: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 13131 1726867225.63983: done queuing things up, now waiting for results queue to drain 13131 1726867225.63985: waiting for pending results... 
13131 1726867225.64273: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13131 1726867225.64420: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000012b 13131 1726867225.64440: variable 'ansible_search_path' from source: unknown 13131 1726867225.64444: variable 'ansible_search_path' from source: unknown 13131 1726867225.64486: calling self._execute() 13131 1726867225.64594: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867225.64614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867225.64631: variable 'omit' from source: magic vars 13131 1726867225.65287: variable 'ansible_distribution_major_version' from source: facts 13131 1726867225.65291: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867225.65293: variable 'omit' from source: magic vars 13131 1726867225.65296: variable 'omit' from source: magic vars 13131 1726867225.65481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867225.68079: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867225.68147: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867225.68186: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867225.68228: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867225.68256: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867225.68342: variable 'network_provider' from source: set_fact 13131 1726867225.68470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867225.68504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867225.68538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867225.68589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867225.68612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867225.68692: variable 'omit' from source: magic vars 13131 1726867225.68812: variable 'omit' from source: magic vars 13131 1726867225.68924: variable 'network_connections' from source: task vars 13131 1726867225.68940: variable 'port1_profile' from source: play vars 13131 1726867225.69009: variable 'port1_profile' from source: play vars 13131 1726867225.69022: variable 'port2_profile' from source: play vars 13131 1726867225.69098: variable 'port2_profile' from source: play vars 13131 1726867225.69253: variable 'omit' from source: magic vars 13131 1726867225.69266: variable '__lsr_ansible_managed' from source: task vars 13131 1726867225.69341: variable '__lsr_ansible_managed' from source: task vars 13131 1726867225.69542: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13131 1726867225.69760: Loaded config def from plugin (lookup/template) 13131 1726867225.69836: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13131 1726867225.69840: File lookup term: get_ansible_managed.j2 13131 1726867225.69842: variable 'ansible_search_path' from source: unknown 13131 1726867225.69844: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13131 1726867225.69848: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13131 1726867225.69862: variable 'ansible_search_path' from source: unknown 13131 1726867225.78162: variable 'ansible_managed' from source: unknown 13131 1726867225.78420: variable 'omit' from source: magic vars 13131 1726867225.78424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867225.78427: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867225.78430: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867225.78432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867225.78434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867225.78436: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867225.78438: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867225.78440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867225.78528: Set connection var ansible_connection to ssh 13131 1726867225.78533: Set connection var ansible_timeout to 10 13131 1726867225.78536: Set connection var ansible_shell_type to sh 13131 1726867225.78545: Set connection var ansible_shell_executable to /bin/sh 13131 1726867225.78554: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867225.78560: Set connection var ansible_pipelining to False 13131 1726867225.78584: variable 'ansible_shell_executable' from source: unknown 13131 1726867225.78587: variable 'ansible_connection' from source: unknown 13131 1726867225.78589: variable 'ansible_module_compression' from source: unknown 13131 1726867225.78592: variable 'ansible_shell_type' from source: unknown 13131 1726867225.78595: variable 'ansible_shell_executable' from source: unknown 13131 1726867225.78597: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867225.78599: variable 'ansible_pipelining' from source: unknown 13131 1726867225.78601: variable 'ansible_timeout' from source: unknown 13131 1726867225.78608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867225.78784: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867225.78795: variable 'omit' from source: magic vars 13131 1726867225.78798: starting attempt loop 13131 1726867225.78800: running the handler 13131 1726867225.78805: _low_level_execute_command(): starting 13131 1726867225.78808: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867225.79734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867225.79770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867225.79784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867225.79822: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867225.79871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867225.81545: stdout chunk (state=3): >>>/root <<< 13131 1726867225.81690: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 13131 1726867225.81693: stdout chunk (state=3): >>><<< 13131 1726867225.81701: stderr chunk (state=3): >>><<< 13131 1726867225.81723: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867225.81804: _low_level_execute_command(): starting 13131 1726867225.81808: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867225.8172855-15062-251115190123628 `" && echo ansible-tmp-1726867225.8172855-15062-251115190123628="` echo /root/.ansible/tmp/ansible-tmp-1726867225.8172855-15062-251115190123628 `" ) && sleep 0' 13131 1726867225.82282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13131 1726867225.82295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867225.82314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867225.82363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867225.82382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867225.82427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867225.84289: stdout chunk (state=3): >>>ansible-tmp-1726867225.8172855-15062-251115190123628=/root/.ansible/tmp/ansible-tmp-1726867225.8172855-15062-251115190123628 <<< 13131 1726867225.84436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867225.84439: stdout chunk (state=3): >>><<< 13131 1726867225.84441: stderr chunk (state=3): >>><<< 13131 1726867225.84598: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867225.8172855-15062-251115190123628=/root/.ansible/tmp/ansible-tmp-1726867225.8172855-15062-251115190123628 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867225.84605: variable 'ansible_module_compression' from source: unknown 13131 1726867225.84607: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13131 1726867225.84609: variable 'ansible_facts' from source: unknown 13131 1726867225.84724: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867225.8172855-15062-251115190123628/AnsiballZ_network_connections.py 13131 1726867225.84851: Sending initial data 13131 1726867225.84949: Sent initial data (168 bytes) 13131 1726867225.85473: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 
originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867225.85517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867225.85523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867225.85564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867225.87119: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867225.87199: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867225.87280: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmplk51m9jt /root/.ansible/tmp/ansible-tmp-1726867225.8172855-15062-251115190123628/AnsiballZ_network_connections.py <<< 13131 1726867225.87284: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867225.8172855-15062-251115190123628/AnsiballZ_network_connections.py" <<< 13131 1726867225.87334: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmplk51m9jt" to remote "/root/.ansible/tmp/ansible-tmp-1726867225.8172855-15062-251115190123628/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867225.8172855-15062-251115190123628/AnsiballZ_network_connections.py" <<< 13131 1726867225.88256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867225.88279: stderr chunk (state=3): >>><<< 13131 1726867225.88282: stdout chunk (state=3): >>><<< 13131 1726867225.88306: done transferring module to remote 13131 1726867225.88312: _low_level_execute_command(): starting 13131 1726867225.88316: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867225.8172855-15062-251115190123628/ /root/.ansible/tmp/ansible-tmp-1726867225.8172855-15062-251115190123628/AnsiballZ_network_connections.py && sleep 0' 13131 1726867225.88717: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867225.88720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
13131 1726867225.88722: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867225.88724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867225.88726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867225.88757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867225.88770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867225.88819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867225.90522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867225.90546: stderr chunk (state=3): >>><<< 13131 1726867225.90549: stdout chunk (state=3): >>><<< 13131 1726867225.90561: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867225.90564: _low_level_execute_command(): starting 13131 1726867225.90569: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867225.8172855-15062-251115190123628/AnsiballZ_network_connections.py && sleep 0' 13131 1726867225.90944: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867225.90985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867225.90988: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867225.90991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867225.90993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867225.90995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867225.90997: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867225.91033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867225.91037: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867225.91096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867226.31252: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hash83iq/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hash83iq/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a: error=unknown <<< 13131 1726867226.33092: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hash83iq/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hash83iq/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail 
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/ca2e10a6-bdb2-4703-8f7f-0fc9be649723: error=unknown <<< 13131 1726867226.33258: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13131 1726867226.35282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867226.35286: stdout chunk (state=3): >>><<< 13131 1726867226.35288: stderr chunk (state=3): >>><<< 13131 1726867226.35291: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hash83iq/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hash83iq/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/14cb0775-cfe5-4c9d-86f0-7deaf75fbb1a: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hash83iq/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hash83iq/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/ca2e10a6-bdb2-4703-8f7f-0fc9be649723: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": 
"absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867226.35294: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867225.8172855-15062-251115190123628/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867226.35296: _low_level_execute_command(): starting 13131 1726867226.35298: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867225.8172855-15062-251115190123628/ > /dev/null 2>&1 && sleep 0' 13131 1726867226.35984: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867226.35987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867226.35990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867226.35992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867226.35995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867226.35997: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867226.36009: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867226.36024: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867226.36059: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867226.36128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867226.36150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867226.36234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867226.38283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867226.38287: stdout chunk (state=3): >>><<< 13131 1726867226.38289: stderr chunk (state=3): >>><<< 13131 1726867226.38292: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867226.38294: handler run complete 13131 1726867226.38296: attempt loop complete, returning result 13131 1726867226.38298: _execute() done 13131 1726867226.38300: dumping result to json 13131 1726867226.38301: done dumping result, returning 13131 1726867226.38303: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-5f24-9b7a-00000000012b] 13131 1726867226.38305: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000012b 13131 1726867226.38370: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000012b 13131 1726867226.38373: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0.1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 13131 1726867226.38505: no more pending results, returning what we have 13131 1726867226.38508: results queue empty 13131 1726867226.38509: checking for any_errors_fatal 13131 1726867226.38514: done checking for any_errors_fatal 13131 1726867226.38515: checking for max_fail_percentage 13131 1726867226.38516: done checking for max_fail_percentage 13131 
1726867226.38517: checking to see if all hosts have failed and the running result is not ok 13131 1726867226.38518: done checking to see if all hosts have failed 13131 1726867226.38518: getting the remaining hosts for this loop 13131 1726867226.38520: done getting the remaining hosts for this loop 13131 1726867226.38523: getting the next task for host managed_node1 13131 1726867226.38528: done getting next task for host managed_node1 13131 1726867226.38532: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13131 1726867226.38535: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867226.38545: getting variables 13131 1726867226.38546: in VariableManager get_vars() 13131 1726867226.38705: Calling all_inventory to load vars for managed_node1 13131 1726867226.38709: Calling groups_inventory to load vars for managed_node1 13131 1726867226.38712: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867226.38722: Calling all_plugins_play to load vars for managed_node1 13131 1726867226.38726: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867226.38730: Calling groups_plugins_play to load vars for managed_node1 13131 1726867226.40307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867226.42616: done with get_vars() 13131 1726867226.42641: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:20:26 -0400 (0:00:00.790) 0:00:41.537 ****** 13131 1726867226.42736: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 13131 1726867226.43080: worker is 1 (out of 1 available) 13131 1726867226.43094: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 13131 1726867226.43107: done queuing things up, now waiting for results queue to drain 13131 1726867226.43109: waiting for pending results... 
13131 1726867226.43385: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 13131 1726867226.43587: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000012c 13131 1726867226.43591: variable 'ansible_search_path' from source: unknown 13131 1726867226.43594: variable 'ansible_search_path' from source: unknown 13131 1726867226.43597: calling self._execute() 13131 1726867226.43647: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867226.43660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867226.43676: variable 'omit' from source: magic vars 13131 1726867226.44053: variable 'ansible_distribution_major_version' from source: facts 13131 1726867226.44070: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867226.44194: variable 'network_state' from source: role '' defaults 13131 1726867226.44209: Evaluated conditional (network_state != {}): False 13131 1726867226.44216: when evaluation is False, skipping this task 13131 1726867226.44223: _execute() done 13131 1726867226.44234: dumping result to json 13131 1726867226.44242: done dumping result, returning 13131 1726867226.44252: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-5f24-9b7a-00000000012c] 13131 1726867226.44262: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000012c 13131 1726867226.44471: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000012c 13131 1726867226.44474: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867226.44539: no more pending results, returning what we have 13131 1726867226.44543: results queue empty 13131 1726867226.44544: checking for any_errors_fatal 13131 1726867226.44556: done checking for any_errors_fatal 
13131 1726867226.44556: checking for max_fail_percentage 13131 1726867226.44559: done checking for max_fail_percentage 13131 1726867226.44560: checking to see if all hosts have failed and the running result is not ok 13131 1726867226.44560: done checking to see if all hosts have failed 13131 1726867226.44561: getting the remaining hosts for this loop 13131 1726867226.44562: done getting the remaining hosts for this loop 13131 1726867226.44566: getting the next task for host managed_node1 13131 1726867226.44572: done getting next task for host managed_node1 13131 1726867226.44578: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13131 1726867226.44583: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867226.44607: getting variables 13131 1726867226.44609: in VariableManager get_vars() 13131 1726867226.44661: Calling all_inventory to load vars for managed_node1 13131 1726867226.44664: Calling groups_inventory to load vars for managed_node1 13131 1726867226.44666: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867226.44880: Calling all_plugins_play to load vars for managed_node1 13131 1726867226.44885: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867226.44888: Calling groups_plugins_play to load vars for managed_node1 13131 1726867226.47348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867226.50922: done with get_vars() 13131 1726867226.50943: done getting variables 13131 1726867226.51001: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:20:26 -0400 (0:00:00.082) 0:00:41.620 ****** 13131 1726867226.51036: entering _queue_task() for managed_node1/debug 13131 1726867226.51766: worker is 1 (out of 1 available) 13131 1726867226.51779: exiting _queue_task() for managed_node1/debug 13131 1726867226.51792: done queuing things up, now waiting for results queue to drain 13131 1726867226.51794: waiting for pending results... 
13131 1726867226.52393: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13131 1726867226.52398: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000012d 13131 1726867226.52593: variable 'ansible_search_path' from source: unknown 13131 1726867226.52601: variable 'ansible_search_path' from source: unknown 13131 1726867226.52642: calling self._execute() 13131 1726867226.53082: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867226.53086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867226.53090: variable 'omit' from source: magic vars 13131 1726867226.53540: variable 'ansible_distribution_major_version' from source: facts 13131 1726867226.53882: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867226.53886: variable 'omit' from source: magic vars 13131 1726867226.54082: variable 'omit' from source: magic vars 13131 1726867226.54086: variable 'omit' from source: magic vars 13131 1726867226.54088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867226.54091: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867226.54093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867226.54095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867226.54097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867226.54099: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867226.54101: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867226.54103: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 13131 1726867226.54341: Set connection var ansible_connection to ssh 13131 1726867226.54354: Set connection var ansible_timeout to 10 13131 1726867226.54361: Set connection var ansible_shell_type to sh 13131 1726867226.54373: Set connection var ansible_shell_executable to /bin/sh 13131 1726867226.54389: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867226.54398: Set connection var ansible_pipelining to False 13131 1726867226.54682: variable 'ansible_shell_executable' from source: unknown 13131 1726867226.54685: variable 'ansible_connection' from source: unknown 13131 1726867226.54688: variable 'ansible_module_compression' from source: unknown 13131 1726867226.54690: variable 'ansible_shell_type' from source: unknown 13131 1726867226.54692: variable 'ansible_shell_executable' from source: unknown 13131 1726867226.54694: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867226.54697: variable 'ansible_pipelining' from source: unknown 13131 1726867226.54699: variable 'ansible_timeout' from source: unknown 13131 1726867226.54701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867226.54781: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867226.54802: variable 'omit' from source: magic vars 13131 1726867226.54866: starting attempt loop 13131 1726867226.54871: running the handler 13131 1726867226.55002: variable '__network_connections_result' from source: set_fact 13131 1726867226.55054: handler run complete 13131 1726867226.55070: attempt loop complete, returning result 13131 1726867226.55073: _execute() done 13131 1726867226.55076: dumping result to json 13131 1726867226.55081: 
done dumping result, returning 13131 1726867226.55310: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-5f24-9b7a-00000000012d] 13131 1726867226.55314: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000012d ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 13131 1726867226.55459: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000012d 13131 1726867226.55465: WORKER PROCESS EXITING 13131 1726867226.55480: no more pending results, returning what we have 13131 1726867226.55484: results queue empty 13131 1726867226.55485: checking for any_errors_fatal 13131 1726867226.55490: done checking for any_errors_fatal 13131 1726867226.55490: checking for max_fail_percentage 13131 1726867226.55492: done checking for max_fail_percentage 13131 1726867226.55493: checking to see if all hosts have failed and the running result is not ok 13131 1726867226.55494: done checking to see if all hosts have failed 13131 1726867226.55494: getting the remaining hosts for this loop 13131 1726867226.55496: done getting the remaining hosts for this loop 13131 1726867226.55499: getting the next task for host managed_node1 13131 1726867226.55507: done getting next task for host managed_node1 13131 1726867226.55512: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13131 1726867226.55515: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13131 1726867226.55527: getting variables 13131 1726867226.55528: in VariableManager get_vars() 13131 1726867226.55582: Calling all_inventory to load vars for managed_node1 13131 1726867226.55586: Calling groups_inventory to load vars for managed_node1 13131 1726867226.55589: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867226.55598: Calling all_plugins_play to load vars for managed_node1 13131 1726867226.55601: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867226.55606: Calling groups_plugins_play to load vars for managed_node1 13131 1726867226.58362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867226.61542: done with get_vars() 13131 1726867226.61567: done getting variables 13131 1726867226.61834: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:20:26 -0400 (0:00:00.108) 0:00:41.729 ****** 13131 1726867226.61869: entering _queue_task() for managed_node1/debug 13131 1726867226.62622: worker is 1 (out of 1 available) 13131 1726867226.62633: exiting _queue_task() for managed_node1/debug 13131 1726867226.62645: done queuing things up, now waiting for results queue to drain 13131 1726867226.62646: waiting for pending results... 
13131 1726867226.62982: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13131 1726867226.63483: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000012e 13131 1726867226.63487: variable 'ansible_search_path' from source: unknown 13131 1726867226.63490: variable 'ansible_search_path' from source: unknown 13131 1726867226.63492: calling self._execute() 13131 1726867226.63495: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867226.63883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867226.63886: variable 'omit' from source: magic vars 13131 1726867226.64241: variable 'ansible_distribution_major_version' from source: facts 13131 1726867226.64682: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867226.64686: variable 'omit' from source: magic vars 13131 1726867226.64688: variable 'omit' from source: magic vars 13131 1726867226.64690: variable 'omit' from source: magic vars 13131 1726867226.64693: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867226.64695: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867226.64697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867226.64901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867226.64918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867226.64956: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867226.64966: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867226.64975: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 13131 1726867226.65073: Set connection var ansible_connection to ssh 13131 1726867226.65083: Set connection var ansible_timeout to 10 13131 1726867226.65386: Set connection var ansible_shell_type to sh 13131 1726867226.65390: Set connection var ansible_shell_executable to /bin/sh 13131 1726867226.65392: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867226.65394: Set connection var ansible_pipelining to False 13131 1726867226.65397: variable 'ansible_shell_executable' from source: unknown 13131 1726867226.65400: variable 'ansible_connection' from source: unknown 13131 1726867226.65402: variable 'ansible_module_compression' from source: unknown 13131 1726867226.65404: variable 'ansible_shell_type' from source: unknown 13131 1726867226.65406: variable 'ansible_shell_executable' from source: unknown 13131 1726867226.65408: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867226.65409: variable 'ansible_pipelining' from source: unknown 13131 1726867226.65411: variable 'ansible_timeout' from source: unknown 13131 1726867226.65413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867226.65523: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867226.65695: variable 'omit' from source: magic vars 13131 1726867226.65708: starting attempt loop 13131 1726867226.65715: running the handler 13131 1726867226.65767: variable '__network_connections_result' from source: set_fact 13131 1726867226.65850: variable '__network_connections_result' from source: set_fact 13131 1726867226.66196: handler run complete 13131 1726867226.66226: attempt loop complete, returning result 13131 1726867226.66233: 
_execute() done 13131 1726867226.66240: dumping result to json 13131 1726867226.66248: done dumping result, returning 13131 1726867226.66260: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-5f24-9b7a-00000000012e] 13131 1726867226.66268: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000012e ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0.1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 13131 1726867226.66491: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000012e 13131 1726867226.66494: WORKER PROCESS EXITING 13131 1726867226.66513: no more pending results, returning what we have 13131 1726867226.66517: results queue empty 13131 1726867226.66518: checking for any_errors_fatal 13131 1726867226.66524: done checking for any_errors_fatal 13131 1726867226.66524: checking for max_fail_percentage 13131 1726867226.66526: done checking for max_fail_percentage 13131 1726867226.66527: checking to see if all hosts have failed and the running result is not ok 13131 1726867226.66527: done checking to see if all hosts have failed 13131 1726867226.66528: getting the remaining hosts for this loop 13131 1726867226.66529: done getting the remaining hosts for this loop 13131 1726867226.66532: getting the next task for host managed_node1 13131 1726867226.66538: done getting next task for host managed_node1 13131 1726867226.66542: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13131 1726867226.66545: ^ state is: HOST STATE: 
block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867226.66555: getting variables 13131 1726867226.66557: in VariableManager get_vars() 13131 1726867226.66811: Calling all_inventory to load vars for managed_node1 13131 1726867226.66814: Calling groups_inventory to load vars for managed_node1 13131 1726867226.66816: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867226.66825: Calling all_plugins_play to load vars for managed_node1 13131 1726867226.66827: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867226.66830: Calling groups_plugins_play to load vars for managed_node1 13131 1726867226.69991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867226.73066: done with get_vars() 13131 1726867226.73096: done getting variables 13131 1726867226.73161: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:20:26 -0400 (0:00:00.115) 0:00:41.844 ****** 13131 1726867226.73415: entering 
_queue_task() for managed_node1/debug 13131 1726867226.74117: worker is 1 (out of 1 available) 13131 1726867226.74126: exiting _queue_task() for managed_node1/debug 13131 1726867226.74136: done queuing things up, now waiting for results queue to drain 13131 1726867226.74137: waiting for pending results... 13131 1726867226.74274: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13131 1726867226.74622: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000012f 13131 1726867226.74710: variable 'ansible_search_path' from source: unknown 13131 1726867226.74720: variable 'ansible_search_path' from source: unknown 13131 1726867226.74760: calling self._execute() 13131 1726867226.74870: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867226.74875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867226.74901: variable 'omit' from source: magic vars 13131 1726867226.75383: variable 'ansible_distribution_major_version' from source: facts 13131 1726867226.75387: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867226.75486: variable 'network_state' from source: role '' defaults 13131 1726867226.75525: Evaluated conditional (network_state != {}): False 13131 1726867226.75533: when evaluation is False, skipping this task 13131 1726867226.75540: _execute() done 13131 1726867226.75550: dumping result to json 13131 1726867226.75558: done dumping result, returning 13131 1726867226.75570: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-5f24-9b7a-00000000012f] 13131 1726867226.75629: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000012f 13131 1726867226.75701: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000012f 13131 1726867226.75707: WORKER PROCESS EXITING skipping: [managed_node1] => { 
"false_condition": "network_state != {}" } 13131 1726867226.75753: no more pending results, returning what we have 13131 1726867226.75758: results queue empty 13131 1726867226.75759: checking for any_errors_fatal 13131 1726867226.75770: done checking for any_errors_fatal 13131 1726867226.75771: checking for max_fail_percentage 13131 1726867226.75773: done checking for max_fail_percentage 13131 1726867226.75774: checking to see if all hosts have failed and the running result is not ok 13131 1726867226.75775: done checking to see if all hosts have failed 13131 1726867226.75776: getting the remaining hosts for this loop 13131 1726867226.75802: done getting the remaining hosts for this loop 13131 1726867226.75808: getting the next task for host managed_node1 13131 1726867226.75816: done getting next task for host managed_node1 13131 1726867226.75821: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13131 1726867226.75825: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867226.75846: getting variables 13131 1726867226.75847: in VariableManager get_vars() 13131 1726867226.75899: Calling all_inventory to load vars for managed_node1 13131 1726867226.75902: Calling groups_inventory to load vars for managed_node1 13131 1726867226.75904: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867226.75914: Calling all_plugins_play to load vars for managed_node1 13131 1726867226.75917: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867226.75919: Calling groups_plugins_play to load vars for managed_node1 13131 1726867226.77596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867226.78950: done with get_vars() 13131 1726867226.78965: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:20:26 -0400 (0:00:00.056) 0:00:41.900 ****** 13131 1726867226.79037: entering _queue_task() for managed_node1/ping 13131 1726867226.79254: worker is 1 (out of 1 available) 13131 1726867226.79268: exiting _queue_task() for managed_node1/ping 13131 1726867226.79280: done queuing things up, now waiting for results queue to drain 13131 1726867226.79282: waiting for pending results... 
13131 1726867226.79454: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 13131 1726867226.79551: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000130 13131 1726867226.79564: variable 'ansible_search_path' from source: unknown 13131 1726867226.79567: variable 'ansible_search_path' from source: unknown 13131 1726867226.79597: calling self._execute() 13131 1726867226.79669: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867226.79675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867226.79685: variable 'omit' from source: magic vars 13131 1726867226.79948: variable 'ansible_distribution_major_version' from source: facts 13131 1726867226.79960: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867226.79965: variable 'omit' from source: magic vars 13131 1726867226.80005: variable 'omit' from source: magic vars 13131 1726867226.80033: variable 'omit' from source: magic vars 13131 1726867226.80066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867226.80094: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867226.80112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867226.80125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867226.80135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867226.80160: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867226.80163: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867226.80165: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 13131 1726867226.80235: Set connection var ansible_connection to ssh 13131 1726867226.80242: Set connection var ansible_timeout to 10 13131 1726867226.80244: Set connection var ansible_shell_type to sh 13131 1726867226.80251: Set connection var ansible_shell_executable to /bin/sh 13131 1726867226.80258: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867226.80263: Set connection var ansible_pipelining to False 13131 1726867226.80336: variable 'ansible_shell_executable' from source: unknown 13131 1726867226.80338: variable 'ansible_connection' from source: unknown 13131 1726867226.80340: variable 'ansible_module_compression' from source: unknown 13131 1726867226.80341: variable 'ansible_shell_type' from source: unknown 13131 1726867226.80344: variable 'ansible_shell_executable' from source: unknown 13131 1726867226.80346: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867226.80348: variable 'ansible_pipelining' from source: unknown 13131 1726867226.80350: variable 'ansible_timeout' from source: unknown 13131 1726867226.80352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867226.80834: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867226.80839: variable 'omit' from source: magic vars 13131 1726867226.80841: starting attempt loop 13131 1726867226.80843: running the handler 13131 1726867226.80845: _low_level_execute_command(): starting 13131 1726867226.80847: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867226.82065: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867226.82115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867226.82135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867226.82159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867226.82246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867226.84046: stdout chunk (state=3): >>>/root <<< 13131 1726867226.84124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867226.84139: stdout chunk (state=3): >>><<< 13131 1726867226.84176: stderr chunk (state=3): >>><<< 13131 1726867226.84314: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867226.84338: _low_level_execute_command(): starting 13131 1726867226.84387: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867226.8432264-15120-249570284283410 `" && echo ansible-tmp-1726867226.8432264-15120-249570284283410="` echo /root/.ansible/tmp/ansible-tmp-1726867226.8432264-15120-249570284283410 `" ) && sleep 0' 13131 1726867226.85363: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867226.85369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867226.85386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867226.85406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867226.85425: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867226.85438: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867226.85531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867226.85556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867226.85572: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867226.85656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867226.87513: stdout chunk (state=3): >>>ansible-tmp-1726867226.8432264-15120-249570284283410=/root/.ansible/tmp/ansible-tmp-1726867226.8432264-15120-249570284283410 <<< 13131 1726867226.87650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867226.87670: stdout chunk (state=3): >>><<< 13131 1726867226.87688: stderr chunk (state=3): >>><<< 13131 1726867226.87708: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867226.8432264-15120-249570284283410=/root/.ansible/tmp/ansible-tmp-1726867226.8432264-15120-249570284283410 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867226.87783: variable 'ansible_module_compression' from source: unknown 13131 1726867226.87808: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 13131 1726867226.87843: variable 'ansible_facts' from source: unknown 13131 1726867226.88016: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867226.8432264-15120-249570284283410/AnsiballZ_ping.py 13131 1726867226.88175: Sending initial data 13131 1726867226.88220: Sent initial data (153 bytes) 13131 1726867226.89274: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867226.89461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867226.89464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867226.89467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867226.91007: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13131 1726867226.91022: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 13131 1726867226.91035: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 13131 1726867226.91047: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 13131 1726867226.91070: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867226.91141: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867226.91207: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpr8vdnrpf /root/.ansible/tmp/ansible-tmp-1726867226.8432264-15120-249570284283410/AnsiballZ_ping.py <<< 13131 1726867226.91228: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867226.8432264-15120-249570284283410/AnsiballZ_ping.py" <<< 13131 1726867226.91268: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpr8vdnrpf" to remote "/root/.ansible/tmp/ansible-tmp-1726867226.8432264-15120-249570284283410/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867226.8432264-15120-249570284283410/AnsiballZ_ping.py" <<< 13131 1726867226.91995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867226.92149: stderr chunk (state=3): >>><<< 13131 1726867226.92152: stdout chunk (state=3): >>><<< 13131 1726867226.92157: done transferring module to remote 13131 1726867226.92159: _low_level_execute_command(): starting 13131 1726867226.92162: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867226.8432264-15120-249570284283410/ /root/.ansible/tmp/ansible-tmp-1726867226.8432264-15120-249570284283410/AnsiballZ_ping.py && sleep 0' 13131 1726867226.92710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867226.92729: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867226.92746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867226.92835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867226.92870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867226.92895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867226.92915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867226.93004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867226.94749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867226.94791: stderr chunk (state=3): >>><<< 13131 1726867226.94883: stdout chunk (state=3): >>><<< 13131 1726867226.94887: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867226.94890: _low_level_execute_command(): starting 13131 1726867226.94892: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867226.8432264-15120-249570284283410/AnsiballZ_ping.py && sleep 0' 13131 1726867226.95433: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867226.95443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867226.95454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867226.95499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867226.95511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867226.95592: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867226.95681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867227.10501: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13131 1726867227.12184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867227.12188: stdout chunk (state=3): >>><<< 13131 1726867227.12192: stderr chunk (state=3): >>><<< 13131 1726867227.12195: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867227.12197: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867226.8432264-15120-249570284283410/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867227.12199: _low_level_execute_command(): starting 13131 1726867227.12201: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867226.8432264-15120-249570284283410/ > /dev/null 2>&1 && sleep 0' 13131 1726867227.13348: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867227.13369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867227.13397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867227.13455: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867227.13556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867227.13690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867227.13738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867227.15605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867227.15609: stdout chunk (state=3): >>><<< 13131 1726867227.15611: stderr chunk (state=3): >>><<< 13131 1726867227.15627: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867227.15642: handler run complete 13131 
1726867227.15665: attempt loop complete, returning result 13131 1726867227.15985: _execute() done 13131 1726867227.15989: dumping result to json 13131 1726867227.15991: done dumping result, returning 13131 1726867227.15993: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-5f24-9b7a-000000000130] 13131 1726867227.15995: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000130 13131 1726867227.16065: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000130 13131 1726867227.16068: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 13131 1726867227.16155: no more pending results, returning what we have 13131 1726867227.16158: results queue empty 13131 1726867227.16159: checking for any_errors_fatal 13131 1726867227.16165: done checking for any_errors_fatal 13131 1726867227.16166: checking for max_fail_percentage 13131 1726867227.16167: done checking for max_fail_percentage 13131 1726867227.16168: checking to see if all hosts have failed and the running result is not ok 13131 1726867227.16169: done checking to see if all hosts have failed 13131 1726867227.16170: getting the remaining hosts for this loop 13131 1726867227.16171: done getting the remaining hosts for this loop 13131 1726867227.16175: getting the next task for host managed_node1 13131 1726867227.16188: done getting next task for host managed_node1 13131 1726867227.16190: ^ task is: TASK: meta (role_complete) 13131 1726867227.16194: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867227.16209: getting variables 13131 1726867227.16210: in VariableManager get_vars() 13131 1726867227.16266: Calling all_inventory to load vars for managed_node1 13131 1726867227.16269: Calling groups_inventory to load vars for managed_node1 13131 1726867227.16272: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867227.16390: Calling all_plugins_play to load vars for managed_node1 13131 1726867227.16394: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867227.16398: Calling groups_plugins_play to load vars for managed_node1 13131 1726867227.23862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867227.25487: done with get_vars() 13131 1726867227.25511: done getting variables 13131 1726867227.25583: done queuing things up, now waiting for results queue to drain 13131 1726867227.25585: results queue empty 13131 1726867227.25586: checking for any_errors_fatal 13131 1726867227.25588: done checking for any_errors_fatal 13131 1726867227.25589: checking for max_fail_percentage 13131 1726867227.25590: done checking for max_fail_percentage 13131 1726867227.25591: checking to see if all hosts have failed and the running result is not ok 13131 1726867227.25591: done checking to see if all hosts have failed 13131 1726867227.25592: getting the remaining hosts for this loop 13131 1726867227.25593: done getting the remaining hosts for this loop 13131 1726867227.25596: getting the next task for host managed_node1 13131 1726867227.25599: done getting next task for host managed_node1 13131 1726867227.25602: ^ task is: TASK: From the active connection, get the controller profile "{{ controller_profile }}" 13131 1726867227.25603: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867227.25605: getting variables 13131 1726867227.25606: in VariableManager get_vars() 13131 1726867227.25626: Calling all_inventory to load vars for managed_node1 13131 1726867227.25628: Calling groups_inventory to load vars for managed_node1 13131 1726867227.25631: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867227.25636: Calling all_plugins_play to load vars for managed_node1 13131 1726867227.25638: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867227.25641: Calling groups_plugins_play to load vars for managed_node1 13131 1726867227.27349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867227.28910: done with get_vars() 13131 1726867227.28928: done getting variables 13131 1726867227.28966: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867227.29061: variable 'controller_profile' from source: play vars TASK [From the active connection, get the controller profile "bond0"] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:200 Friday 20 September 2024 17:20:27 -0400 (0:00:00.500) 0:00:42.401 ****** 13131 1726867227.29086: entering _queue_task() for managed_node1/command 13131 1726867227.29528: worker is 1 (out of 1 available) 13131 1726867227.29539: exiting _queue_task() for managed_node1/command 13131 1726867227.29548: done queuing things up, now waiting for results queue to drain 
13131 1726867227.29549: waiting for pending results... 13131 1726867227.29766: running TaskExecutor() for managed_node1/TASK: From the active connection, get the controller profile "bond0" 13131 1726867227.29884: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000160 13131 1726867227.29911: variable 'ansible_search_path' from source: unknown 13131 1726867227.29951: calling self._execute() 13131 1726867227.30060: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867227.30074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867227.30092: variable 'omit' from source: magic vars 13131 1726867227.30574: variable 'ansible_distribution_major_version' from source: facts 13131 1726867227.30595: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867227.30719: variable 'network_provider' from source: set_fact 13131 1726867227.30731: Evaluated conditional (network_provider == "nm"): True 13131 1726867227.30742: variable 'omit' from source: magic vars 13131 1726867227.30768: variable 'omit' from source: magic vars 13131 1726867227.30865: variable 'controller_profile' from source: play vars 13131 1726867227.30891: variable 'omit' from source: magic vars 13131 1726867227.30934: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867227.30971: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867227.30998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867227.31024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867227.31040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867227.31076: variable 'inventory_hostname' from 
source: host vars for 'managed_node1' 13131 1726867227.31089: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867227.31097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867227.31204: Set connection var ansible_connection to ssh 13131 1726867227.31217: Set connection var ansible_timeout to 10 13131 1726867227.31229: Set connection var ansible_shell_type to sh 13131 1726867227.31243: Set connection var ansible_shell_executable to /bin/sh 13131 1726867227.31258: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867227.31267: Set connection var ansible_pipelining to False 13131 1726867227.31294: variable 'ansible_shell_executable' from source: unknown 13131 1726867227.31336: variable 'ansible_connection' from source: unknown 13131 1726867227.31340: variable 'ansible_module_compression' from source: unknown 13131 1726867227.31342: variable 'ansible_shell_type' from source: unknown 13131 1726867227.31345: variable 'ansible_shell_executable' from source: unknown 13131 1726867227.31347: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867227.31349: variable 'ansible_pipelining' from source: unknown 13131 1726867227.31351: variable 'ansible_timeout' from source: unknown 13131 1726867227.31354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867227.31483: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867227.31553: variable 'omit' from source: magic vars 13131 1726867227.31556: starting attempt loop 13131 1726867227.31559: running the handler 13131 1726867227.31562: _low_level_execute_command(): starting 13131 1726867227.31564: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867227.32232: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867227.32246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867227.32265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867227.32292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867227.32317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867227.32405: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867227.32454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867227.32510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867227.34286: stdout chunk (state=3): >>>/root <<< 13131 1726867227.34333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867227.34344: stdout chunk (state=3): >>><<< 13131 1726867227.34360: stderr chunk (state=3): >>><<< 13131 1726867227.34392: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867227.34415: _low_level_execute_command(): starting 13131 1726867227.34429: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867227.343983-15153-83267516440483 `" && echo ansible-tmp-1726867227.343983-15153-83267516440483="` echo /root/.ansible/tmp/ansible-tmp-1726867227.343983-15153-83267516440483 `" ) && sleep 0' 13131 1726867227.35132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867227.35211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867227.35214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867227.35242: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867227.35261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867227.35295: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867227.35371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867227.35598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867227.35673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867227.37549: stdout chunk (state=3): >>>ansible-tmp-1726867227.343983-15153-83267516440483=/root/.ansible/tmp/ansible-tmp-1726867227.343983-15153-83267516440483 <<< 13131 1726867227.37686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867227.37697: stdout chunk (state=3): >>><<< 13131 1726867227.37716: stderr chunk (state=3): >>><<< 13131 1726867227.37742: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867227.343983-15153-83267516440483=/root/.ansible/tmp/ansible-tmp-1726867227.343983-15153-83267516440483 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867227.37779: variable 'ansible_module_compression' from source: unknown 13131 1726867227.37843: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13131 1726867227.37885: variable 'ansible_facts' from source: unknown 13131 1726867227.37988: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867227.343983-15153-83267516440483/AnsiballZ_command.py 13131 1726867227.38346: Sending initial data 13131 1726867227.38350: Sent initial data (154 bytes) 13131 1726867227.38915: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867227.38983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867227.39000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 
originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867227.39050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867227.39071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867227.39089: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867227.39207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867227.40740: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13131 1726867227.40774: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867227.40811: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13131 1726867227.40866: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp4rs_w83j /root/.ansible/tmp/ansible-tmp-1726867227.343983-15153-83267516440483/AnsiballZ_command.py <<< 13131 1726867227.40869: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867227.343983-15153-83267516440483/AnsiballZ_command.py" <<< 13131 1726867227.40914: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp4rs_w83j" to remote "/root/.ansible/tmp/ansible-tmp-1726867227.343983-15153-83267516440483/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867227.343983-15153-83267516440483/AnsiballZ_command.py" <<< 13131 1726867227.41787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867227.41791: stdout chunk (state=3): >>><<< 13131 1726867227.41793: stderr chunk (state=3): >>><<< 13131 1726867227.41796: done transferring module to remote 13131 1726867227.41798: _low_level_execute_command(): starting 13131 1726867227.41800: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867227.343983-15153-83267516440483/ /root/.ansible/tmp/ansible-tmp-1726867227.343983-15153-83267516440483/AnsiballZ_command.py && sleep 0' 13131 1726867227.42350: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867227.42367: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867227.42384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867227.42474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867227.42529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867227.42596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867227.44392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867227.44401: stdout chunk (state=3): >>><<< 13131 1726867227.44416: stderr chunk (state=3): >>><<< 13131 1726867227.44437: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867227.44452: _low_level_execute_command(): starting 13131 1726867227.44462: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867227.343983-15153-83267516440483/AnsiballZ_command.py && sleep 0' 13131 1726867227.45109: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867227.45167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867227.45185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867227.45235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867227.45291: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13131 1726867227.63854: stdout chunk (state=3): >>> <<< 13131 1726867227.63902: stdout chunk (state=3): >>>{"changed": true, "stdout": "connection.id: bond0\nconnection.uuid: 69e7ee46-007a-470e-9bdc-4928b4af57bb\nconnection.stable-id: --\nconnection.type: bond\nconnection.interface-name: nm-bond\nconnection.autoconnect: yes\nconnection.autoconnect-priority: 0\nconnection.autoconnect-retries: -1 (default)\nconnection.multi-connect: 0 (default)\nconnection.auth-retries: -1\nconnection.timestamp: 1726867220\nconnection.permissions: --\nconnection.zone: --\nconnection.controller: --\nconnection.master: --\nconnection.slave-type: --\nconnection.port-type: --\nconnection.autoconnect-slaves: -1 (default)\nconnection.autoconnect-ports: -1 (default)\nconnection.down-on-poweroff: -1 (default)\nconnection.secondaries: --\nconnection.gateway-ping-timeout: 0\nconnection.metered: unknown\nconnection.lldp: default\nconnection.mdns: -1 (default)\nconnection.llmnr: -1 (default)\nconnection.dns-over-tls: -1 (default)\nconnection.mptcp-flags: 0x0 (default)\nconnection.wait-device-timeout: -1\nconnection.wait-activation-delay: -1\nipv4.method: auto\nipv4.dns: --\nipv4.dns-search: --\nipv4.dns-options: --\nipv4.dns-priority: 0\nipv4.addresses: --\nipv4.gateway: --\nipv4.routes: --\nipv4.route-metric: 65535\nipv4.route-table: 0 (unspec)\nipv4.routing-rules: --\nipv4.replace-local-rule: -1 (default)\nipv4.dhcp-send-release: -1 (default)\nipv4.ignore-auto-routes: no\nipv4.ignore-auto-dns: no\nipv4.dhcp-client-id: --\nipv4.dhcp-iaid: --\nipv4.dhcp-dscp: --\nipv4.dhcp-timeout: 0 (default)\nipv4.dhcp-send-hostname: yes\nipv4.dhcp-hostname: --\nipv4.dhcp-fqdn: --\nipv4.dhcp-hostname-flags: 0x0 (none)\nipv4.never-default: no\nipv4.may-fail: yes\nipv4.required-timeout: -1 (default)\nipv4.dad-timeout: -1 (default)\nipv4.dhcp-vendor-class-identifier: --\nipv4.link-local: 0 (default)\nipv4.dhcp-reject-servers: --\nipv4.auto-route-ext-gw: -1 
(default)\nipv6.method: auto\nipv6.dns: --\nipv6.dns-search: --\nipv6.dns-options: --\nipv6.dns-priority: 0\nipv6.addresses: --\nipv6.gateway: --\nipv6.routes: --\nipv6.route-metric: -1\nipv6.route-table: 0 (unspec)\nipv6.routing-rules: --\nipv6.replace-local-rule: -1 (default)\nipv6.dhcp-send-release: -1 (default)\nipv6.ignore-auto-routes: no\nipv6.ignore-auto-dns: no\nipv6.never-default: no\nipv6.may-fail: yes\nipv6.required-timeout: -1 (default)\nipv6.ip6-privacy: -1 (default)\nipv6.temp-valid-lifetime: 0 (default)\nipv6.temp-preferred-lifetime: 0 (default)\nipv6.addr-gen-mode: default\nipv6.ra-timeout: 0 (default)\nipv6.mtu: auto\ni<<< 13131 1726867227.63916: stdout chunk (state=3): >>>pv6.dhcp-pd-hint: --\nipv6.dhcp-duid: --\nipv6.dhcp-iaid: --\nipv6.dhcp-timeout: 0 (default)\nipv6.dhcp-send-hostname: yes\nipv6.dhcp-hostname: --\nipv6.dhcp-hostname-flags: 0x0 (none)\nipv6.auto-route-ext-gw: -1 (default)\nipv6.token: --\nbond.options: mode=active-backup,miimon=110\nproxy.method: none\nproxy.browser-only: no\nproxy.pac-url: --\nproxy.pac-script: --\nGENERAL.NAME: bond0\nGENERAL.UUID: 69e7ee46-007a-470e-9bdc-4928b4af57bb\nGENERAL.DEVICES: nm-bond\nGENERAL.IP-IFACE: nm-bond\nGENERAL.STATE: activated\nGENERAL.DEFAULT: no\nGENERAL.DEFAULT6: no\nGENERAL.SPEC-OBJECT: --\nGENERAL.VPN: no\nGENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/22\nGENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/18\nGENERAL.ZONE: --\nGENERAL.MASTER-PATH: --\nIP4.ADDRESS[1]: 192.0.2.50/24\nIP4.GATEWAY: 192.0.2.1\nIP4.ROUTE[1]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535\nIP4.ROUTE[2]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535\nIP4.DNS[1]: 192.0.2.1\nDHCP4.OPTION[1]: broadcast_address = 192.0.2.255\nDHCP4.OPTION[2]: dhcp_client_identifier = 01:b6:e9:1d:8b:19:45\nDHCP4.OPTION[3]: dhcp_lease_time = 240\nDHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1\nDHCP4.OPTION[5]: domain_name_servers = 192.0.2.1\nDHCP4.OPTION[6]: expiry = 1726867460\nDHCP4.OPTION[7]: 
host_name = ip-10-31-12-57\nDHCP4.OPTION[8]: ip_address = 192.0.2.50\nDHCP4.OPTION[9]: next_server = 192.0.2.1\nDHCP4.OPTION[10]: requested_broadcast_address = 1\nDHCP4.OPTION[11]: requested_domain_name = 1\nDHCP4.OPTION[12]: requested_domain_name_servers = 1\nDHCP4.OPTION[13]: requested_domain_search = 1\nDHCP4.OPTION[14]: requested_host_name = 1\nDHCP4.OPTION[15]: requested_interface_mtu = 1\nDHCP4.OPTION[16]: requested_ms_classless_static_routes = 1\nDHCP4.OPTION[17]: requested_nis_domain = 1\nDHCP4.OPTION[18]: requested_nis_servers = 1\nDHCP4.OPTION[19]: requested_ntp_servers = 1\nDHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1\nDHCP4.OPTION[21]: requested_root_path = 1\nDHCP4.OPTION[22]: requested_routers = 1\nDHCP4.OPTION[23]: requested_static_routes = 1\nDHCP4.OPTION[24]: requested_subnet_mask = 1\nDHCP4.OPTION[25]: requested_time_offset = 1\nDHCP4.OPTION[26]: requested_wpad = 1\nDHCP4.OPTION[27]: routers = 192.0.2.1\nDHCP4.OPTION[28]: subnet_mask = 255.255.255.0\nIP6.ADDRESS[1]: 2001:db8::1a/128\nIP6.ADDRESS[2]: 2001:db8::b4e9:1dff:fe8b:1945/64\nIP6.ADDRESS[3]: fe80::b4e9:1dff:fe8b:1945/64\nIP6.GATEWAY: fe80::70e7:75ff:fe50:d635\nIP6.ROUTE[1]: dst = 2001:db8::1a/128, nh = ::, mt = 300\nIP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt = 300\nIP6.ROUTE[3]: dst = fe80::/64, nh = ::, mt = 1024\nIP6.ROUTE[4]: dst = ::/0, nh = fe80::70e7:75ff:fe50:d635, mt = 300\nIP6.DNS[1]: 2001:db8::1c00:40ff:fe76:1d49\nIP6.DNS[2]: fe80::70e7:75ff:fe50:d635\nDHCP6.OPTION[1]: dhcp6_client_id = 00:04:fa:0d:c3:7c:4c:9f:5f:6f:52:aa:57:2c:11:45:67:7d\nDHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::1c00:40ff:fe76:1d49\nDHCP6.OPTION[3]: fqdn_fqdn = ip-10-31-12-57\nDHCP6.OPTION[4]: iaid = 8c:3b:13:c0\nDHCP6.OPTION[5]: ip6_address = 2001:db8::1a", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0"], "start": "2024-09-20 17:20:27.605174", "end": "2024-09-20 17:20:27.636619", "delta": "0:00:00.031445", "msg": "", "invocation": {"module_args": 
{"_raw_params": "nmcli c show --active bond0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13131 1726867227.65583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867227.65587: stdout chunk (state=3): >>><<< 13131 1726867227.65591: stderr chunk (state=3): >>><<< 13131 1726867227.65594: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "connection.id: bond0\nconnection.uuid: 69e7ee46-007a-470e-9bdc-4928b4af57bb\nconnection.stable-id: --\nconnection.type: bond\nconnection.interface-name: nm-bond\nconnection.autoconnect: yes\nconnection.autoconnect-priority: 0\nconnection.autoconnect-retries: -1 (default)\nconnection.multi-connect: 0 (default)\nconnection.auth-retries: -1\nconnection.timestamp: 1726867220\nconnection.permissions: --\nconnection.zone: --\nconnection.controller: --\nconnection.master: --\nconnection.slave-type: --\nconnection.port-type: --\nconnection.autoconnect-slaves: -1 (default)\nconnection.autoconnect-ports: -1 (default)\nconnection.down-on-poweroff: -1 (default)\nconnection.secondaries: --\nconnection.gateway-ping-timeout: 0\nconnection.metered: unknown\nconnection.lldp: default\nconnection.mdns: -1 (default)\nconnection.llmnr: -1 (default)\nconnection.dns-over-tls: -1 (default)\nconnection.mptcp-flags: 0x0 (default)\nconnection.wait-device-timeout: -1\nconnection.wait-activation-delay: -1\nipv4.method: auto\nipv4.dns: --\nipv4.dns-search: --\nipv4.dns-options: --\nipv4.dns-priority: 0\nipv4.addresses: --\nipv4.gateway: --\nipv4.routes: --\nipv4.route-metric: 65535\nipv4.route-table: 0 (unspec)\nipv4.routing-rules: --\nipv4.replace-local-rule: -1 (default)\nipv4.dhcp-send-release: -1 (default)\nipv4.ignore-auto-routes: no\nipv4.ignore-auto-dns: no\nipv4.dhcp-client-id: 
--\nipv4.dhcp-iaid: --\nipv4.dhcp-dscp: --\nipv4.dhcp-timeout: 0 (default)\nipv4.dhcp-send-hostname: yes\nipv4.dhcp-hostname: --\nipv4.dhcp-fqdn: --\nipv4.dhcp-hostname-flags: 0x0 (none)\nipv4.never-default: no\nipv4.may-fail: yes\nipv4.required-timeout: -1 (default)\nipv4.dad-timeout: -1 (default)\nipv4.dhcp-vendor-class-identifier: --\nipv4.link-local: 0 (default)\nipv4.dhcp-reject-servers: --\nipv4.auto-route-ext-gw: -1 (default)\nipv6.method: auto\nipv6.dns: --\nipv6.dns-search: --\nipv6.dns-options: --\nipv6.dns-priority: 0\nipv6.addresses: --\nipv6.gateway: --\nipv6.routes: --\nipv6.route-metric: -1\nipv6.route-table: 0 (unspec)\nipv6.routing-rules: --\nipv6.replace-local-rule: -1 (default)\nipv6.dhcp-send-release: -1 (default)\nipv6.ignore-auto-routes: no\nipv6.ignore-auto-dns: no\nipv6.never-default: no\nipv6.may-fail: yes\nipv6.required-timeout: -1 (default)\nipv6.ip6-privacy: -1 (default)\nipv6.temp-valid-lifetime: 0 (default)\nipv6.temp-preferred-lifetime: 0 (default)\nipv6.addr-gen-mode: default\nipv6.ra-timeout: 0 (default)\nipv6.mtu: auto\nipv6.dhcp-pd-hint: --\nipv6.dhcp-duid: --\nipv6.dhcp-iaid: --\nipv6.dhcp-timeout: 0 (default)\nipv6.dhcp-send-hostname: yes\nipv6.dhcp-hostname: --\nipv6.dhcp-hostname-flags: 0x0 (none)\nipv6.auto-route-ext-gw: -1 (default)\nipv6.token: --\nbond.options: mode=active-backup,miimon=110\nproxy.method: none\nproxy.browser-only: no\nproxy.pac-url: --\nproxy.pac-script: --\nGENERAL.NAME: bond0\nGENERAL.UUID: 69e7ee46-007a-470e-9bdc-4928b4af57bb\nGENERAL.DEVICES: nm-bond\nGENERAL.IP-IFACE: nm-bond\nGENERAL.STATE: activated\nGENERAL.DEFAULT: no\nGENERAL.DEFAULT6: no\nGENERAL.SPEC-OBJECT: --\nGENERAL.VPN: no\nGENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/22\nGENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/18\nGENERAL.ZONE: --\nGENERAL.MASTER-PATH: --\nIP4.ADDRESS[1]: 192.0.2.50/24\nIP4.GATEWAY: 192.0.2.1\nIP4.ROUTE[1]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535\nIP4.ROUTE[2]: dst = 
192.0.2.0/24, nh = 0.0.0.0, mt = 65535\nIP4.DNS[1]: 192.0.2.1\nDHCP4.OPTION[1]: broadcast_address = 192.0.2.255\nDHCP4.OPTION[2]: dhcp_client_identifier = 01:b6:e9:1d:8b:19:45\nDHCP4.OPTION[3]: dhcp_lease_time = 240\nDHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1\nDHCP4.OPTION[5]: domain_name_servers = 192.0.2.1\nDHCP4.OPTION[6]: expiry = 1726867460\nDHCP4.OPTION[7]: host_name = ip-10-31-12-57\nDHCP4.OPTION[8]: ip_address = 192.0.2.50\nDHCP4.OPTION[9]: next_server = 192.0.2.1\nDHCP4.OPTION[10]: requested_broadcast_address = 1\nDHCP4.OPTION[11]: requested_domain_name = 1\nDHCP4.OPTION[12]: requested_domain_name_servers = 1\nDHCP4.OPTION[13]: requested_domain_search = 1\nDHCP4.OPTION[14]: requested_host_name = 1\nDHCP4.OPTION[15]: requested_interface_mtu = 1\nDHCP4.OPTION[16]: requested_ms_classless_static_routes = 1\nDHCP4.OPTION[17]: requested_nis_domain = 1\nDHCP4.OPTION[18]: requested_nis_servers = 1\nDHCP4.OPTION[19]: requested_ntp_servers = 1\nDHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1\nDHCP4.OPTION[21]: requested_root_path = 1\nDHCP4.OPTION[22]: requested_routers = 1\nDHCP4.OPTION[23]: requested_static_routes = 1\nDHCP4.OPTION[24]: requested_subnet_mask = 1\nDHCP4.OPTION[25]: requested_time_offset = 1\nDHCP4.OPTION[26]: requested_wpad = 1\nDHCP4.OPTION[27]: routers = 192.0.2.1\nDHCP4.OPTION[28]: subnet_mask = 255.255.255.0\nIP6.ADDRESS[1]: 2001:db8::1a/128\nIP6.ADDRESS[2]: 2001:db8::b4e9:1dff:fe8b:1945/64\nIP6.ADDRESS[3]: fe80::b4e9:1dff:fe8b:1945/64\nIP6.GATEWAY: fe80::70e7:75ff:fe50:d635\nIP6.ROUTE[1]: dst = 2001:db8::1a/128, nh = ::, mt = 300\nIP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt = 300\nIP6.ROUTE[3]: dst = fe80::/64, nh = ::, mt = 1024\nIP6.ROUTE[4]: dst = ::/0, nh = fe80::70e7:75ff:fe50:d635, mt = 300\nIP6.DNS[1]: 2001:db8::1c00:40ff:fe76:1d49\nIP6.DNS[2]: fe80::70e7:75ff:fe50:d635\nDHCP6.OPTION[1]: dhcp6_client_id = 00:04:fa:0d:c3:7c:4c:9f:5f:6f:52:aa:57:2c:11:45:67:7d\nDHCP6.OPTION[2]: dhcp6_name_servers = 
2001:db8::1c00:40ff:fe76:1d49\nDHCP6.OPTION[3]: fqdn_fqdn = ip-10-31-12-57\nDHCP6.OPTION[4]: iaid = 8c:3b:13:c0\nDHCP6.OPTION[5]: ip6_address = 2001:db8::1a", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0"], "start": "2024-09-20 17:20:27.605174", "end": "2024-09-20 17:20:27.636619", "delta": "0:00:00.031445", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867227.65651: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867227.343983-15153-83267516440483/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867227.65664: _low_level_execute_command(): starting 13131 1726867227.65673: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867227.343983-15153-83267516440483/ > /dev/null 2>&1 && sleep 0' 13131 1726867227.66317: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867227.66332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867227.66388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 
originally 10.31.12.57 debug2: match found <<< 13131 1726867227.66465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867227.66497: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867227.66572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867227.68403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867227.68418: stdout chunk (state=3): >>><<< 13131 1726867227.68429: stderr chunk (state=3): >>><<< 13131 1726867227.68456: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867227.68548: handler run 
complete 13131 1726867227.68551: Evaluated conditional (False): False 13131 1726867227.68554: attempt loop complete, returning result 13131 1726867227.68555: _execute() done 13131 1726867227.68557: dumping result to json 13131 1726867227.68559: done dumping result, returning 13131 1726867227.68561: done running TaskExecutor() for managed_node1/TASK: From the active connection, get the controller profile "bond0" [0affcac9-a3a5-5f24-9b7a-000000000160] 13131 1726867227.68563: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000160 ok: [managed_node1] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0" ], "delta": "0:00:00.031445", "end": "2024-09-20 17:20:27.636619", "rc": 0, "start": "2024-09-20 17:20:27.605174" } STDOUT: connection.id: bond0 connection.uuid: 69e7ee46-007a-470e-9bdc-4928b4af57bb connection.stable-id: -- connection.type: bond connection.interface-name: nm-bond connection.autoconnect: yes connection.autoconnect-priority: 0 connection.autoconnect-retries: -1 (default) connection.multi-connect: 0 (default) connection.auth-retries: -1 connection.timestamp: 1726867220 connection.permissions: -- connection.zone: -- connection.controller: -- connection.master: -- connection.slave-type: -- connection.port-type: -- connection.autoconnect-slaves: -1 (default) connection.autoconnect-ports: -1 (default) connection.down-on-poweroff: -1 (default) connection.secondaries: -- connection.gateway-ping-timeout: 0 connection.metered: unknown connection.lldp: default connection.mdns: -1 (default) connection.llmnr: -1 (default) connection.dns-over-tls: -1 (default) connection.mptcp-flags: 0x0 (default) connection.wait-device-timeout: -1 connection.wait-activation-delay: -1 ipv4.method: auto ipv4.dns: -- ipv4.dns-search: -- ipv4.dns-options: -- ipv4.dns-priority: 0 ipv4.addresses: -- ipv4.gateway: -- ipv4.routes: -- ipv4.route-metric: 65535 ipv4.route-table: 0 (unspec) ipv4.routing-rules: -- ipv4.replace-local-rule: -1 (default) 
ipv4.dhcp-send-release: -1 (default) ipv4.ignore-auto-routes: no ipv4.ignore-auto-dns: no ipv4.dhcp-client-id: -- ipv4.dhcp-iaid: -- ipv4.dhcp-dscp: -- ipv4.dhcp-timeout: 0 (default) ipv4.dhcp-send-hostname: yes ipv4.dhcp-hostname: -- ipv4.dhcp-fqdn: -- ipv4.dhcp-hostname-flags: 0x0 (none) ipv4.never-default: no ipv4.may-fail: yes ipv4.required-timeout: -1 (default) ipv4.dad-timeout: -1 (default) ipv4.dhcp-vendor-class-identifier: -- ipv4.link-local: 0 (default) ipv4.dhcp-reject-servers: -- ipv4.auto-route-ext-gw: -1 (default) ipv6.method: auto ipv6.dns: -- ipv6.dns-search: -- ipv6.dns-options: -- ipv6.dns-priority: 0 ipv6.addresses: -- ipv6.gateway: -- ipv6.routes: -- ipv6.route-metric: -1 ipv6.route-table: 0 (unspec) ipv6.routing-rules: -- ipv6.replace-local-rule: -1 (default) ipv6.dhcp-send-release: -1 (default) ipv6.ignore-auto-routes: no ipv6.ignore-auto-dns: no ipv6.never-default: no ipv6.may-fail: yes ipv6.required-timeout: -1 (default) ipv6.ip6-privacy: -1 (default) ipv6.temp-valid-lifetime: 0 (default) ipv6.temp-preferred-lifetime: 0 (default) ipv6.addr-gen-mode: default ipv6.ra-timeout: 0 (default) ipv6.mtu: auto ipv6.dhcp-pd-hint: -- ipv6.dhcp-duid: -- ipv6.dhcp-iaid: -- ipv6.dhcp-timeout: 0 (default) ipv6.dhcp-send-hostname: yes ipv6.dhcp-hostname: -- ipv6.dhcp-hostname-flags: 0x0 (none) ipv6.auto-route-ext-gw: -1 (default) ipv6.token: -- bond.options: mode=active-backup,miimon=110 proxy.method: none proxy.browser-only: no proxy.pac-url: -- proxy.pac-script: -- GENERAL.NAME: bond0 GENERAL.UUID: 69e7ee46-007a-470e-9bdc-4928b4af57bb GENERAL.DEVICES: nm-bond GENERAL.IP-IFACE: nm-bond GENERAL.STATE: activated GENERAL.DEFAULT: no GENERAL.DEFAULT6: no GENERAL.SPEC-OBJECT: -- GENERAL.VPN: no GENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/22 GENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/18 GENERAL.ZONE: -- GENERAL.MASTER-PATH: -- IP4.ADDRESS[1]: 192.0.2.50/24 IP4.GATEWAY: 192.0.2.1 IP4.ROUTE[1]: dst = 0.0.0.0/0, nh = 192.0.2.1, 
mt = 65535 IP4.ROUTE[2]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535 IP4.DNS[1]: 192.0.2.1 DHCP4.OPTION[1]: broadcast_address = 192.0.2.255 DHCP4.OPTION[2]: dhcp_client_identifier = 01:b6:e9:1d:8b:19:45 DHCP4.OPTION[3]: dhcp_lease_time = 240 DHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1 DHCP4.OPTION[5]: domain_name_servers = 192.0.2.1 DHCP4.OPTION[6]: expiry = 1726867460 DHCP4.OPTION[7]: host_name = ip-10-31-12-57 DHCP4.OPTION[8]: ip_address = 192.0.2.50 DHCP4.OPTION[9]: next_server = 192.0.2.1 DHCP4.OPTION[10]: requested_broadcast_address = 1 DHCP4.OPTION[11]: requested_domain_name = 1 DHCP4.OPTION[12]: requested_domain_name_servers = 1 DHCP4.OPTION[13]: requested_domain_search = 1 DHCP4.OPTION[14]: requested_host_name = 1 DHCP4.OPTION[15]: requested_interface_mtu = 1 DHCP4.OPTION[16]: requested_ms_classless_static_routes = 1 DHCP4.OPTION[17]: requested_nis_domain = 1 DHCP4.OPTION[18]: requested_nis_servers = 1 DHCP4.OPTION[19]: requested_ntp_servers = 1 DHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1 DHCP4.OPTION[21]: requested_root_path = 1 DHCP4.OPTION[22]: requested_routers = 1 DHCP4.OPTION[23]: requested_static_routes = 1 DHCP4.OPTION[24]: requested_subnet_mask = 1 DHCP4.OPTION[25]: requested_time_offset = 1 DHCP4.OPTION[26]: requested_wpad = 1 DHCP4.OPTION[27]: routers = 192.0.2.1 DHCP4.OPTION[28]: subnet_mask = 255.255.255.0 IP6.ADDRESS[1]: 2001:db8::1a/128 IP6.ADDRESS[2]: 2001:db8::b4e9:1dff:fe8b:1945/64 IP6.ADDRESS[3]: fe80::b4e9:1dff:fe8b:1945/64 IP6.GATEWAY: fe80::70e7:75ff:fe50:d635 IP6.ROUTE[1]: dst = 2001:db8::1a/128, nh = ::, mt = 300 IP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt = 300 IP6.ROUTE[3]: dst = fe80::/64, nh = ::, mt = 1024 IP6.ROUTE[4]: dst = ::/0, nh = fe80::70e7:75ff:fe50:d635, mt = 300 IP6.DNS[1]: 2001:db8::1c00:40ff:fe76:1d49 IP6.DNS[2]: fe80::70e7:75ff:fe50:d635 DHCP6.OPTION[1]: dhcp6_client_id = 00:04:fa:0d:c3:7c:4c:9f:5f:6f:52:aa:57:2c:11:45:67:7d DHCP6.OPTION[2]: dhcp6_name_servers = 
2001:db8::1c00:40ff:fe76:1d49 DHCP6.OPTION[3]: fqdn_fqdn = ip-10-31-12-57 DHCP6.OPTION[4]: iaid = 8c:3b:13:c0 DHCP6.OPTION[5]: ip6_address = 2001:db8::1a 13131 1726867227.69135: no more pending results, returning what we have 13131 1726867227.69138: results queue empty 13131 1726867227.69140: checking for any_errors_fatal 13131 1726867227.69142: done checking for any_errors_fatal 13131 1726867227.69143: checking for max_fail_percentage 13131 1726867227.69145: done checking for max_fail_percentage 13131 1726867227.69146: checking to see if all hosts have failed and the running result is not ok 13131 1726867227.69146: done checking to see if all hosts have failed 13131 1726867227.69147: getting the remaining hosts for this loop 13131 1726867227.69148: done getting the remaining hosts for this loop 13131 1726867227.69152: getting the next task for host managed_node1 13131 1726867227.69158: done getting next task for host managed_node1 13131 1726867227.69161: ^ task is: TASK: Assert that the controller profile is activated 13131 1726867227.69163: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867227.69167: getting variables 13131 1726867227.69169: in VariableManager get_vars() 13131 1726867227.69231: Calling all_inventory to load vars for managed_node1 13131 1726867227.69235: Calling groups_inventory to load vars for managed_node1 13131 1726867227.69237: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867227.69284: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000160 13131 1726867227.69289: WORKER PROCESS EXITING 13131 1726867227.69304: Calling all_plugins_play to load vars for managed_node1 13131 1726867227.69308: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867227.69311: Calling groups_plugins_play to load vars for managed_node1 13131 1726867227.70954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867227.72618: done with get_vars() 13131 1726867227.72639: done getting variables 13131 1726867227.72708: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:207 Friday 20 September 2024 17:20:27 -0400 (0:00:00.436) 0:00:42.837 ****** 13131 1726867227.72736: entering _queue_task() for managed_node1/assert 13131 1726867227.73062: worker is 1 (out of 1 available) 13131 1726867227.73073: exiting _queue_task() for managed_node1/assert 13131 1726867227.73222: done queuing things up, now waiting for results queue to drain 13131 1726867227.73224: waiting for pending results... 
13131 1726867227.73386: running TaskExecutor() for managed_node1/TASK: Assert that the controller profile is activated 13131 1726867227.73500: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000161 13131 1726867227.73521: variable 'ansible_search_path' from source: unknown 13131 1726867227.73574: calling self._execute() 13131 1726867227.73687: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867227.73701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867227.73764: variable 'omit' from source: magic vars 13131 1726867227.74133: variable 'ansible_distribution_major_version' from source: facts 13131 1726867227.74150: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867227.74276: variable 'network_provider' from source: set_fact 13131 1726867227.74292: Evaluated conditional (network_provider == "nm"): True 13131 1726867227.74308: variable 'omit' from source: magic vars 13131 1726867227.74340: variable 'omit' from source: magic vars 13131 1726867227.74482: variable 'controller_profile' from source: play vars 13131 1726867227.74485: variable 'omit' from source: magic vars 13131 1726867227.74516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867227.74567: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867227.74595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867227.74617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867227.74682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867227.74686: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867227.74689: variable 
'ansible_host' from source: host vars for 'managed_node1' 13131 1726867227.74696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867227.74807: Set connection var ansible_connection to ssh 13131 1726867227.74821: Set connection var ansible_timeout to 10 13131 1726867227.74828: Set connection var ansible_shell_type to sh 13131 1726867227.74841: Set connection var ansible_shell_executable to /bin/sh 13131 1726867227.74867: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867227.74964: Set connection var ansible_pipelining to False 13131 1726867227.74968: variable 'ansible_shell_executable' from source: unknown 13131 1726867227.74971: variable 'ansible_connection' from source: unknown 13131 1726867227.74973: variable 'ansible_module_compression' from source: unknown 13131 1726867227.74975: variable 'ansible_shell_type' from source: unknown 13131 1726867227.74980: variable 'ansible_shell_executable' from source: unknown 13131 1726867227.74982: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867227.74984: variable 'ansible_pipelining' from source: unknown 13131 1726867227.74986: variable 'ansible_timeout' from source: unknown 13131 1726867227.74988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867227.75095: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867227.75181: variable 'omit' from source: magic vars 13131 1726867227.75185: starting attempt loop 13131 1726867227.75188: running the handler 13131 1726867227.75320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867227.77618: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867227.77704: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867227.77784: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867227.77790: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867227.77831: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867227.77917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867227.77955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867227.78183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867227.78187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867227.78190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867227.78192: variable 'active_controller_profile' from source: set_fact 13131 1726867227.78194: Evaluated conditional (active_controller_profile.stdout | length != 0): True 13131 1726867227.78197: handler run complete 13131 1726867227.78213: attempt 
loop complete, returning result 13131 1726867227.78220: _execute() done 13131 1726867227.78228: dumping result to json 13131 1726867227.78234: done dumping result, returning 13131 1726867227.78245: done running TaskExecutor() for managed_node1/TASK: Assert that the controller profile is activated [0affcac9-a3a5-5f24-9b7a-000000000161] 13131 1726867227.78253: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000161 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 13131 1726867227.78469: no more pending results, returning what we have 13131 1726867227.78473: results queue empty 13131 1726867227.78474: checking for any_errors_fatal 13131 1726867227.78483: done checking for any_errors_fatal 13131 1726867227.78484: checking for max_fail_percentage 13131 1726867227.78487: done checking for max_fail_percentage 13131 1726867227.78488: checking to see if all hosts have failed and the running result is not ok 13131 1726867227.78488: done checking to see if all hosts have failed 13131 1726867227.78489: getting the remaining hosts for this loop 13131 1726867227.78491: done getting the remaining hosts for this loop 13131 1726867227.78494: getting the next task for host managed_node1 13131 1726867227.78500: done getting next task for host managed_node1 13131 1726867227.78503: ^ task is: TASK: Get the controller device details 13131 1726867227.78505: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13131 1726867227.78508: getting variables 13131 1726867227.78510: in VariableManager get_vars() 13131 1726867227.78622: Calling all_inventory to load vars for managed_node1 13131 1726867227.78755: Calling groups_inventory to load vars for managed_node1 13131 1726867227.78759: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867227.78769: Calling all_plugins_play to load vars for managed_node1 13131 1726867227.78772: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867227.78775: Calling groups_plugins_play to load vars for managed_node1 13131 1726867227.78642: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000161 13131 1726867227.79371: WORKER PROCESS EXITING 13131 1726867227.80188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867227.81165: done with get_vars() 13131 1726867227.81182: done getting variables 13131 1726867227.81223: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the controller device details] *************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:214 Friday 20 September 2024 17:20:27 -0400 (0:00:00.085) 0:00:42.923 ****** 13131 1726867227.81243: entering _queue_task() for managed_node1/command 13131 1726867227.81466: worker is 1 (out of 1 available) 13131 1726867227.81480: exiting _queue_task() for managed_node1/command 13131 1726867227.81491: done queuing things up, now waiting for results queue to drain 13131 1726867227.81492: waiting for pending results... 
13131 1726867227.81661: running TaskExecutor() for managed_node1/TASK: Get the controller device details 13131 1726867227.81730: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000162 13131 1726867227.81744: variable 'ansible_search_path' from source: unknown 13131 1726867227.81771: calling self._execute() 13131 1726867227.81864: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867227.81881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867227.81918: variable 'omit' from source: magic vars 13131 1726867227.82281: variable 'ansible_distribution_major_version' from source: facts 13131 1726867227.82285: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867227.82356: variable 'network_provider' from source: set_fact 13131 1726867227.82360: Evaluated conditional (network_provider == "initscripts"): False 13131 1726867227.82362: when evaluation is False, skipping this task 13131 1726867227.82365: _execute() done 13131 1726867227.82397: dumping result to json 13131 1726867227.82401: done dumping result, returning 13131 1726867227.82407: done running TaskExecutor() for managed_node1/TASK: Get the controller device details [0affcac9-a3a5-5f24-9b7a-000000000162] 13131 1726867227.82410: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000162 13131 1726867227.82471: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000162 13131 1726867227.82473: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13131 1726867227.82543: no more pending results, returning what we have 13131 1726867227.82546: results queue empty 13131 1726867227.82547: checking for any_errors_fatal 13131 1726867227.82552: done checking for any_errors_fatal 13131 1726867227.82553: checking for max_fail_percentage 13131 1726867227.82555: done checking for 
max_fail_percentage 13131 1726867227.82555: checking to see if all hosts have failed and the running result is not ok 13131 1726867227.82556: done checking to see if all hosts have failed 13131 1726867227.82557: getting the remaining hosts for this loop 13131 1726867227.82558: done getting the remaining hosts for this loop 13131 1726867227.82561: getting the next task for host managed_node1 13131 1726867227.82566: done getting next task for host managed_node1 13131 1726867227.82568: ^ task is: TASK: Assert that the controller profile is activated 13131 1726867227.82571: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13131 1726867227.82573: getting variables 13131 1726867227.82574: in VariableManager get_vars() 13131 1726867227.82618: Calling all_inventory to load vars for managed_node1 13131 1726867227.82621: Calling groups_inventory to load vars for managed_node1 13131 1726867227.82623: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867227.82632: Calling all_plugins_play to load vars for managed_node1 13131 1726867227.82634: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867227.82637: Calling groups_plugins_play to load vars for managed_node1 13131 1726867227.83673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867227.84557: done with get_vars() 13131 1726867227.84570: done getting variables 13131 1726867227.84613: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=True) TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:221 Friday 20 September 2024 17:20:27 -0400 (0:00:00.033) 0:00:42.956 ****** 13131 1726867227.84632: entering _queue_task() for managed_node1/assert 13131 1726867227.84821: worker is 1 (out of 1 available) 13131 1726867227.84834: exiting _queue_task() for managed_node1/assert 13131 1726867227.84843: done queuing things up, now waiting for results queue to drain 13131 1726867227.84844: waiting for pending results... 13131 1726867227.85264: running TaskExecutor() for managed_node1/TASK: Assert that the controller profile is activated 13131 1726867227.85401: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000163 13131 1726867227.85405: variable 'ansible_search_path' from source: unknown 13131 1726867227.85408: calling self._execute() 13131 1726867227.85457: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867227.85469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867227.85485: variable 'omit' from source: magic vars 13131 1726867227.85869: variable 'ansible_distribution_major_version' from source: facts 13131 1726867227.85888: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867227.85970: variable 'network_provider' from source: set_fact 13131 1726867227.85975: Evaluated conditional (network_provider == "initscripts"): False 13131 1726867227.85980: when evaluation is False, skipping this task 13131 1726867227.85983: _execute() done 13131 1726867227.85990: dumping result to json 13131 1726867227.85993: done dumping result, returning 13131 1726867227.85997: done running TaskExecutor() for managed_node1/TASK: Assert that the controller profile is activated [0affcac9-a3a5-5f24-9b7a-000000000163] 13131 1726867227.86002: sending task result for task 
0affcac9-a3a5-5f24-9b7a-000000000163 13131 1726867227.86089: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000163 13131 1726867227.86092: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13131 1726867227.86143: no more pending results, returning what we have 13131 1726867227.86146: results queue empty 13131 1726867227.86147: checking for any_errors_fatal 13131 1726867227.86151: done checking for any_errors_fatal 13131 1726867227.86152: checking for max_fail_percentage 13131 1726867227.86154: done checking for max_fail_percentage 13131 1726867227.86155: checking to see if all hosts have failed and the running result is not ok 13131 1726867227.86155: done checking to see if all hosts have failed 13131 1726867227.86156: getting the remaining hosts for this loop 13131 1726867227.86157: done getting the remaining hosts for this loop 13131 1726867227.86160: getting the next task for host managed_node1 13131 1726867227.86172: done getting next task for host managed_node1 13131 1726867227.86180: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13131 1726867227.86184: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13131 1726867227.86202: getting variables 13131 1726867227.86204: in VariableManager get_vars() 13131 1726867227.86247: Calling all_inventory to load vars for managed_node1 13131 1726867227.86250: Calling groups_inventory to load vars for managed_node1 13131 1726867227.86252: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867227.86259: Calling all_plugins_play to load vars for managed_node1 13131 1726867227.86262: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867227.86264: Calling groups_plugins_play to load vars for managed_node1 13131 1726867227.87260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867227.88947: done with get_vars() 13131 1726867227.88965: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:20:27 -0400 (0:00:00.044) 0:00:43.001 ****** 13131 1726867227.89047: entering _queue_task() for managed_node1/include_tasks 13131 1726867227.89298: worker is 1 (out of 1 available) 13131 1726867227.89313: exiting _queue_task() for managed_node1/include_tasks 13131 1726867227.89325: done queuing things up, now waiting for results queue to drain 13131 1726867227.89327: waiting for pending results... 
13131 1726867227.89571: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13131 1726867227.89685: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000016c 13131 1726867227.89695: variable 'ansible_search_path' from source: unknown 13131 1726867227.89698: variable 'ansible_search_path' from source: unknown 13131 1726867227.89731: calling self._execute() 13131 1726867227.89809: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867227.89813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867227.89823: variable 'omit' from source: magic vars 13131 1726867227.90105: variable 'ansible_distribution_major_version' from source: facts 13131 1726867227.90116: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867227.90121: _execute() done 13131 1726867227.90124: dumping result to json 13131 1726867227.90126: done dumping result, returning 13131 1726867227.90137: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-5f24-9b7a-00000000016c] 13131 1726867227.90140: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000016c 13131 1726867227.90336: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000016c 13131 1726867227.90405: no more pending results, returning what we have 13131 1726867227.90409: in VariableManager get_vars() 13131 1726867227.90455: Calling all_inventory to load vars for managed_node1 13131 1726867227.90459: Calling groups_inventory to load vars for managed_node1 13131 1726867227.90461: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867227.90470: Calling all_plugins_play to load vars for managed_node1 13131 1726867227.90472: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867227.90476: Calling groups_plugins_play to load vars for managed_node1 13131 
1726867227.91006: WORKER PROCESS EXITING 13131 1726867227.91742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867227.93092: done with get_vars() 13131 1726867227.93107: variable 'ansible_search_path' from source: unknown 13131 1726867227.93108: variable 'ansible_search_path' from source: unknown 13131 1726867227.93132: we have included files to process 13131 1726867227.93133: generating all_blocks data 13131 1726867227.93134: done generating all_blocks data 13131 1726867227.93138: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13131 1726867227.93139: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13131 1726867227.93140: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13131 1726867227.93521: done processing included file 13131 1726867227.93523: iterating over new_blocks loaded from include file 13131 1726867227.93524: in VariableManager get_vars() 13131 1726867227.93544: done with get_vars() 13131 1726867227.93545: filtering new block on tags 13131 1726867227.93569: done filtering new block on tags 13131 1726867227.93571: in VariableManager get_vars() 13131 1726867227.93592: done with get_vars() 13131 1726867227.93593: filtering new block on tags 13131 1726867227.93623: done filtering new block on tags 13131 1726867227.93625: in VariableManager get_vars() 13131 1726867227.93642: done with get_vars() 13131 1726867227.93643: filtering new block on tags 13131 1726867227.93665: done filtering new block on tags 13131 1726867227.93666: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 13131 1726867227.93670: extending task lists for all hosts 
with included blocks 13131 1726867227.94265: done extending task lists 13131 1726867227.94266: done processing included files 13131 1726867227.94267: results queue empty 13131 1726867227.94267: checking for any_errors_fatal 13131 1726867227.94269: done checking for any_errors_fatal 13131 1726867227.94270: checking for max_fail_percentage 13131 1726867227.94271: done checking for max_fail_percentage 13131 1726867227.94271: checking to see if all hosts have failed and the running result is not ok 13131 1726867227.94272: done checking to see if all hosts have failed 13131 1726867227.94272: getting the remaining hosts for this loop 13131 1726867227.94273: done getting the remaining hosts for this loop 13131 1726867227.94274: getting the next task for host managed_node1 13131 1726867227.94279: done getting next task for host managed_node1 13131 1726867227.94281: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13131 1726867227.94283: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13131 1726867227.94290: getting variables 13131 1726867227.94291: in VariableManager get_vars() 13131 1726867227.94305: Calling all_inventory to load vars for managed_node1 13131 1726867227.94307: Calling groups_inventory to load vars for managed_node1 13131 1726867227.94308: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867227.94311: Calling all_plugins_play to load vars for managed_node1 13131 1726867227.94313: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867227.94314: Calling groups_plugins_play to load vars for managed_node1 13131 1726867227.95340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867227.96776: done with get_vars() 13131 1726867227.96796: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:20:27 -0400 (0:00:00.078) 0:00:43.079 ****** 13131 1726867227.96862: entering _queue_task() for managed_node1/setup 13131 1726867227.97158: worker is 1 (out of 1 available) 13131 1726867227.97169: exiting _queue_task() for managed_node1/setup 13131 1726867227.97383: done queuing things up, now waiting for results queue to drain 13131 1726867227.97385: waiting for pending results... 
13131 1726867227.97496: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13131 1726867227.97607: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000914 13131 1726867227.97626: variable 'ansible_search_path' from source: unknown 13131 1726867227.97631: variable 'ansible_search_path' from source: unknown 13131 1726867227.97661: calling self._execute() 13131 1726867227.97735: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867227.97741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867227.97749: variable 'omit' from source: magic vars 13131 1726867227.98226: variable 'ansible_distribution_major_version' from source: facts 13131 1726867227.98230: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867227.98458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867228.00860: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867228.00940: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867228.00971: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867228.01053: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867228.01056: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867228.01201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867228.01205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867228.01208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867228.01243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867228.01258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867228.01339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867228.01356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867228.01397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867228.01451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867228.01463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867228.01648: variable '__network_required_facts' from source: role 
'' defaults 13131 1726867228.01662: variable 'ansible_facts' from source: unknown 13131 1726867228.02388: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13131 1726867228.02393: when evaluation is False, skipping this task 13131 1726867228.02395: _execute() done 13131 1726867228.02398: dumping result to json 13131 1726867228.02400: done dumping result, returning 13131 1726867228.02408: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-5f24-9b7a-000000000914] 13131 1726867228.02410: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000914 13131 1726867228.02500: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000914 13131 1726867228.02505: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867228.02549: no more pending results, returning what we have 13131 1726867228.02552: results queue empty 13131 1726867228.02553: checking for any_errors_fatal 13131 1726867228.02555: done checking for any_errors_fatal 13131 1726867228.02555: checking for max_fail_percentage 13131 1726867228.02557: done checking for max_fail_percentage 13131 1726867228.02558: checking to see if all hosts have failed and the running result is not ok 13131 1726867228.02558: done checking to see if all hosts have failed 13131 1726867228.02559: getting the remaining hosts for this loop 13131 1726867228.02560: done getting the remaining hosts for this loop 13131 1726867228.02563: getting the next task for host managed_node1 13131 1726867228.02572: done getting next task for host managed_node1 13131 1726867228.02576: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13131 1726867228.02584: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, 
handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867228.02607: getting variables 13131 1726867228.02609: in VariableManager get_vars() 13131 1726867228.02659: Calling all_inventory to load vars for managed_node1 13131 1726867228.02662: Calling groups_inventory to load vars for managed_node1 13131 1726867228.02664: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867228.02673: Calling all_plugins_play to load vars for managed_node1 13131 1726867228.02675: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867228.02709: Calling groups_plugins_play to load vars for managed_node1 13131 1726867228.03670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867228.05682: done with get_vars() 13131 1726867228.05705: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:20:28 -0400 (0:00:00.089) 0:00:43.168 ****** 13131 1726867228.05811: entering _queue_task() for managed_node1/stat 13131 1726867228.06359: worker is 1 (out of 1 available) 13131 1726867228.06375: exiting _queue_task() for managed_node1/stat 13131 1726867228.06586: done queuing things up, now waiting for results queue to drain 13131 1726867228.06587: waiting for pending results... 
13131 1726867228.07194: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 13131 1726867228.07411: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000916 13131 1726867228.07432: variable 'ansible_search_path' from source: unknown 13131 1726867228.07527: variable 'ansible_search_path' from source: unknown 13131 1726867228.07623: calling self._execute() 13131 1726867228.07783: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867228.07790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867228.07801: variable 'omit' from source: magic vars 13131 1726867228.08664: variable 'ansible_distribution_major_version' from source: facts 13131 1726867228.08675: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867228.08902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867228.09180: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867228.09230: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867228.09263: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867228.09296: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867228.09406: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867228.09442: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867228.09468: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867228.09495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867228.09584: variable '__network_is_ostree' from source: set_fact 13131 1726867228.09782: Evaluated conditional (not __network_is_ostree is defined): False 13131 1726867228.09785: when evaluation is False, skipping this task 13131 1726867228.09787: _execute() done 13131 1726867228.09788: dumping result to json 13131 1726867228.09790: done dumping result, returning 13131 1726867228.09792: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-5f24-9b7a-000000000916] 13131 1726867228.09794: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000916 13131 1726867228.09852: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000916 13131 1726867228.09855: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13131 1726867228.09921: no more pending results, returning what we have 13131 1726867228.09927: results queue empty 13131 1726867228.09928: checking for any_errors_fatal 13131 1726867228.09933: done checking for any_errors_fatal 13131 1726867228.09934: checking for max_fail_percentage 13131 1726867228.09935: done checking for max_fail_percentage 13131 1726867228.09936: checking to see if all hosts have failed and the running result is not ok 13131 1726867228.09937: done checking to see if all hosts have failed 13131 1726867228.09938: getting the remaining hosts for this loop 13131 1726867228.09939: done getting the remaining hosts for this loop 13131 
1726867228.09942: getting the next task for host managed_node1 13131 1726867228.09951: done getting next task for host managed_node1 13131 1726867228.09955: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13131 1726867228.09960: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867228.09984: getting variables 13131 1726867228.09985: in VariableManager get_vars() 13131 1726867228.10035: Calling all_inventory to load vars for managed_node1 13131 1726867228.10038: Calling groups_inventory to load vars for managed_node1 13131 1726867228.10040: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867228.10049: Calling all_plugins_play to load vars for managed_node1 13131 1726867228.10051: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867228.10054: Calling groups_plugins_play to load vars for managed_node1 13131 1726867228.12073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867228.17389: done with get_vars() 13131 1726867228.17487: done getting variables 13131 1726867228.17590: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:20:28 -0400 (0:00:00.118) 0:00:43.286 ****** 13131 1726867228.17630: entering _queue_task() for managed_node1/set_fact 13131 1726867228.18814: worker is 1 (out of 1 available) 13131 1726867228.18827: exiting _queue_task() for managed_node1/set_fact 13131 1726867228.18841: done queuing things up, now waiting for results queue to drain 13131 1726867228.18842: waiting for pending results... 
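The follow-up task "Set flag to indicate system is ostree" (set_facts.yml:17) is guarded by the same `not __network_is_ostree is defined` condition and is likewise skipped. A hedged sketch of the shape such a task typically takes (the stat result variable name is an assumption, not taken from the log):

```yaml
# Sketch only; "__ostree_booted_stat" is an assumed register name from
# a preceding stat task. The fact name and condition appear in the log.
- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | default(false) }}"
  when: not __network_is_ostree is defined
```

This check-then-set pair is idempotent: once the fact exists, both tasks short-circuit on subsequent role invocations, which is exactly the skip behavior recorded above.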
13131 1726867228.19610: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13131 1726867228.19990: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000917 13131 1726867228.20009: variable 'ansible_search_path' from source: unknown 13131 1726867228.20012: variable 'ansible_search_path' from source: unknown 13131 1726867228.20383: calling self._execute() 13131 1726867228.20518: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867228.20526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867228.20536: variable 'omit' from source: magic vars 13131 1726867228.21445: variable 'ansible_distribution_major_version' from source: facts 13131 1726867228.21457: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867228.21876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867228.22186: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867228.22251: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867228.22327: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867228.22364: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867228.22467: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867228.22495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867228.22580: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867228.22585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867228.22656: variable '__network_is_ostree' from source: set_fact 13131 1726867228.22756: Evaluated conditional (not __network_is_ostree is defined): False 13131 1726867228.22758: when evaluation is False, skipping this task 13131 1726867228.22761: _execute() done 13131 1726867228.22762: dumping result to json 13131 1726867228.22764: done dumping result, returning 13131 1726867228.22767: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-5f24-9b7a-000000000917] 13131 1726867228.22768: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000917 13131 1726867228.22842: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000917 13131 1726867228.22846: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13131 1726867228.22900: no more pending results, returning what we have 13131 1726867228.22906: results queue empty 13131 1726867228.22907: checking for any_errors_fatal 13131 1726867228.22920: done checking for any_errors_fatal 13131 1726867228.22921: checking for max_fail_percentage 13131 1726867228.22924: done checking for max_fail_percentage 13131 1726867228.22925: checking to see if all hosts have failed and the running result is not ok 13131 1726867228.22926: done checking to see if all hosts have failed 13131 1726867228.22926: getting the remaining hosts for this loop 13131 1726867228.22928: done getting the remaining hosts for this loop 
13131 1726867228.22932: getting the next task for host managed_node1 13131 1726867228.22941: done getting next task for host managed_node1 13131 1726867228.22945: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13131 1726867228.22952: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867228.22983: getting variables 13131 1726867228.22990: in VariableManager get_vars() 13131 1726867228.23060: Calling all_inventory to load vars for managed_node1 13131 1726867228.23313: Calling groups_inventory to load vars for managed_node1 13131 1726867228.23321: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867228.23331: Calling all_plugins_play to load vars for managed_node1 13131 1726867228.23334: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867228.23336: Calling groups_plugins_play to load vars for managed_node1 13131 1726867228.26591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867228.29063: done with get_vars() 13131 1726867228.29110: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:20:28 -0400 (0:00:00.116) 0:00:43.403 ****** 13131 1726867228.29263: entering _queue_task() for managed_node1/service_facts 13131 1726867228.29922: worker is 1 (out of 1 available) 13131 1726867228.29931: exiting _queue_task() for managed_node1/service_facts 13131 1726867228.29944: done queuing things up, now waiting for results queue to drain 13131 1726867228.29946: waiting for pending results... 
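Unlike the two skipped tasks, "Check which services are running" (set_facts.yml:21) proceeds to execution: the log confirms the module is service_facts (queued as managed_node1/service_facts and shipped as the cached ansible.modules.service_facts AnsiballZ payload). The task itself is minimal, since service_facts takes no parameters and populates `ansible_facts.services` directly, matching the JSON the module returns further below:

```yaml
# service_facts is confirmed by the log; it accepts no arguments and
# injects the "services" dict into ansible_facts on the target host.
- name: Check which services are running
  ansible.builtin.service_facts:
```

The subsequent `_low_level_execute_command()` calls trace the standard remote execution flow over a multiplexed SSH connection: resolve the remote home directory, create a per-task temp directory under `~/.ansible/tmp`, sftp the AnsiballZ wrapper across, `chmod u+x` it, and finally invoke it with the remote Python interpreter.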
13131 1726867228.30160: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 13131 1726867228.30725: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000919 13131 1726867228.30731: variable 'ansible_search_path' from source: unknown 13131 1726867228.30735: variable 'ansible_search_path' from source: unknown 13131 1726867228.30739: calling self._execute() 13131 1726867228.30885: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867228.30903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867228.30916: variable 'omit' from source: magic vars 13131 1726867228.31450: variable 'ansible_distribution_major_version' from source: facts 13131 1726867228.31461: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867228.31467: variable 'omit' from source: magic vars 13131 1726867228.31548: variable 'omit' from source: magic vars 13131 1726867228.31589: variable 'omit' from source: magic vars 13131 1726867228.31628: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867228.31670: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867228.31690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867228.31708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867228.31721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867228.31756: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867228.31759: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867228.31762: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 13131 1726867228.31868: Set connection var ansible_connection to ssh 13131 1726867228.31876: Set connection var ansible_timeout to 10 13131 1726867228.31886: Set connection var ansible_shell_type to sh 13131 1726867228.31896: Set connection var ansible_shell_executable to /bin/sh 13131 1726867228.31907: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867228.31912: Set connection var ansible_pipelining to False 13131 1726867228.31937: variable 'ansible_shell_executable' from source: unknown 13131 1726867228.31941: variable 'ansible_connection' from source: unknown 13131 1726867228.31944: variable 'ansible_module_compression' from source: unknown 13131 1726867228.31946: variable 'ansible_shell_type' from source: unknown 13131 1726867228.31948: variable 'ansible_shell_executable' from source: unknown 13131 1726867228.31950: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867228.31959: variable 'ansible_pipelining' from source: unknown 13131 1726867228.31962: variable 'ansible_timeout' from source: unknown 13131 1726867228.31966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867228.32164: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867228.32180: variable 'omit' from source: magic vars 13131 1726867228.32185: starting attempt loop 13131 1726867228.32188: running the handler 13131 1726867228.32205: _low_level_execute_command(): starting 13131 1726867228.32216: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867228.32991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867228.33006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867228.33069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867228.33098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867228.33119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867228.33137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867228.33209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867228.35023: stdout chunk (state=3): >>>/root <<< 13131 1726867228.35100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867228.35108: stdout chunk (state=3): >>><<< 13131 1726867228.35111: stderr chunk (state=3): >>><<< 13131 1726867228.35401: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867228.35409: _low_level_execute_command(): starting 13131 1726867228.35412: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867228.3529906-15186-95517342801960 `" && echo ansible-tmp-1726867228.3529906-15186-95517342801960="` echo /root/.ansible/tmp/ansible-tmp-1726867228.3529906-15186-95517342801960 `" ) && sleep 0' 13131 1726867228.36610: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867228.36667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867228.36787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867228.38698: stdout chunk (state=3): >>>ansible-tmp-1726867228.3529906-15186-95517342801960=/root/.ansible/tmp/ansible-tmp-1726867228.3529906-15186-95517342801960 <<< 13131 1726867228.38834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867228.38851: stdout chunk (state=3): >>><<< 13131 1726867228.38863: stderr chunk (state=3): >>><<< 13131 1726867228.38883: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867228.3529906-15186-95517342801960=/root/.ansible/tmp/ansible-tmp-1726867228.3529906-15186-95517342801960 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867228.38938: variable 'ansible_module_compression' from source: unknown 13131 1726867228.38988: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13131 1726867228.39034: variable 'ansible_facts' from source: unknown 13131 1726867228.39223: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867228.3529906-15186-95517342801960/AnsiballZ_service_facts.py 13131 1726867228.39352: Sending initial data 13131 1726867228.39361: Sent initial data (161 bytes) 13131 1726867228.39853: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867228.39960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 
1726867228.39983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867228.39998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867228.40202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867228.41598: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13131 1726867228.41608: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 13131 1726867228.41611: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 13131 1726867228.41619: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 13131 1726867228.41626: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 13131 1726867228.41633: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 13131 1726867228.41649: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867228.41701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867228.41826: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp3we5ka66 /root/.ansible/tmp/ansible-tmp-1726867228.3529906-15186-95517342801960/AnsiballZ_service_facts.py <<< 13131 1726867228.41829: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867228.3529906-15186-95517342801960/AnsiballZ_service_facts.py" <<< 13131 1726867228.41867: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp3we5ka66" to remote "/root/.ansible/tmp/ansible-tmp-1726867228.3529906-15186-95517342801960/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867228.3529906-15186-95517342801960/AnsiballZ_service_facts.py" <<< 13131 1726867228.42849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867228.42982: stderr chunk (state=3): >>><<< 13131 1726867228.42986: stdout chunk (state=3): >>><<< 13131 1726867228.42988: done transferring module to remote 13131 1726867228.42991: _low_level_execute_command(): starting 13131 1726867228.42993: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867228.3529906-15186-95517342801960/ /root/.ansible/tmp/ansible-tmp-1726867228.3529906-15186-95517342801960/AnsiballZ_service_facts.py && sleep 0' 13131 1726867228.43539: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867228.43558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867228.43590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867228.43613: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867228.43671: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867228.43722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867228.43744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867228.43772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867228.43847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867228.45632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867228.45635: stdout chunk (state=3): >>><<< 13131 1726867228.45638: stderr chunk (state=3): >>><<< 13131 1726867228.45729: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867228.45732: _low_level_execute_command(): starting 13131 1726867228.45735: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867228.3529906-15186-95517342801960/AnsiballZ_service_facts.py && sleep 0' 13131 1726867228.46300: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867228.46306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867228.46336: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867228.46411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867229.99925: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": 
{"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13131 1726867230.01560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867230.01564: stdout chunk (state=3): >>><<< 13131 1726867230.01782: stderr chunk (state=3): >>><<< 13131 1726867230.01788: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": 
"systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": 
"unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": 
"microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867230.03736: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867228.3529906-15186-95517342801960/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867230.03996: _low_level_execute_command(): starting 13131 1726867230.04010: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867228.3529906-15186-95517342801960/ > /dev/null 2>&1 && sleep 0' 13131 1726867230.05232: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867230.05235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867230.05237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13131 1726867230.05239: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867230.05241: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867230.05592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867230.05796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867230.06248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867230.07910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867230.08303: stderr chunk (state=3): >>><<< 13131 1726867230.08306: stdout chunk (state=3): >>><<< 13131 1726867230.08482: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 13131 1726867230.08486: handler run complete 13131 1726867230.09081: variable 'ansible_facts' from source: unknown 13131 1726867230.09682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867230.10810: variable 'ansible_facts' from source: unknown 13131 1726867230.11145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867230.11749: attempt loop complete, returning result 13131 1726867230.11760: _execute() done 13131 1726867230.11769: dumping result to json 13131 1726867230.11837: done dumping result, returning 13131 1726867230.12093: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-5f24-9b7a-000000000919] 13131 1726867230.12482: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000919 13131 1726867230.13987: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000919 13131 1726867230.13991: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867230.14118: no more pending results, returning what we have 13131 1726867230.14122: results queue empty 13131 1726867230.14123: checking for any_errors_fatal 13131 1726867230.14128: done checking for any_errors_fatal 13131 1726867230.14131: checking for max_fail_percentage 13131 1726867230.14134: done checking for max_fail_percentage 13131 1726867230.14135: checking to see if all hosts have failed and the running result is not ok 13131 1726867230.14135: done checking to see if all hosts have failed 13131 1726867230.14136: getting the remaining hosts for this loop 13131 1726867230.14137: done getting the remaining hosts for this loop 13131 1726867230.14141: getting the next task for host managed_node1 13131 1726867230.14151: done getting next 
task for host managed_node1 13131 1726867230.14155: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13131 1726867230.14161: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867230.14174: getting variables 13131 1726867230.14176: in VariableManager get_vars() 13131 1726867230.14685: Calling all_inventory to load vars for managed_node1 13131 1726867230.14689: Calling groups_inventory to load vars for managed_node1 13131 1726867230.14691: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867230.14700: Calling all_plugins_play to load vars for managed_node1 13131 1726867230.14705: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867230.14708: Calling groups_plugins_play to load vars for managed_node1 13131 1726867230.18238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867230.22442: done with get_vars() 13131 1726867230.22469: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:20:30 -0400 (0:00:01.934) 0:00:45.337 ****** 13131 1726867230.22887: entering _queue_task() for managed_node1/package_facts 13131 1726867230.24221: worker is 1 (out of 1 available) 13131 1726867230.24233: exiting _queue_task() for managed_node1/package_facts 13131 1726867230.24246: done queuing things up, now waiting for results queue to drain 13131 1726867230.24248: waiting for pending results... 
13131 1726867230.24829: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 13131 1726867230.25096: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000091a 13131 1726867230.25354: variable 'ansible_search_path' from source: unknown 13131 1726867230.25358: variable 'ansible_search_path' from source: unknown 13131 1726867230.25361: calling self._execute() 13131 1726867230.25892: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867230.25900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867230.25913: variable 'omit' from source: magic vars 13131 1726867230.27200: variable 'ansible_distribution_major_version' from source: facts 13131 1726867230.27204: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867230.27208: variable 'omit' from source: magic vars 13131 1726867230.27331: variable 'omit' from source: magic vars 13131 1726867230.27490: variable 'omit' from source: magic vars 13131 1726867230.27704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867230.27744: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867230.27762: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867230.27782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867230.27917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867230.27969: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867230.27973: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867230.27976: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 13131 1726867230.28164: Set connection var ansible_connection to ssh 13131 1726867230.28173: Set connection var ansible_timeout to 10 13131 1726867230.28178: Set connection var ansible_shell_type to sh 13131 1726867230.28233: Set connection var ansible_shell_executable to /bin/sh 13131 1726867230.28243: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867230.28249: Set connection var ansible_pipelining to False 13131 1726867230.28293: variable 'ansible_shell_executable' from source: unknown 13131 1726867230.28297: variable 'ansible_connection' from source: unknown 13131 1726867230.28300: variable 'ansible_module_compression' from source: unknown 13131 1726867230.28302: variable 'ansible_shell_type' from source: unknown 13131 1726867230.28304: variable 'ansible_shell_executable' from source: unknown 13131 1726867230.28307: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867230.28309: variable 'ansible_pipelining' from source: unknown 13131 1726867230.28311: variable 'ansible_timeout' from source: unknown 13131 1726867230.28314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867230.28949: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867230.28954: variable 'omit' from source: magic vars 13131 1726867230.28956: starting attempt loop 13131 1726867230.28959: running the handler 13131 1726867230.28961: _low_level_execute_command(): starting 13131 1726867230.29165: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867230.30873: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 13131 1726867230.30929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867230.31123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867230.31132: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867230.31140: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867230.31146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867230.31236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867230.31241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867230.31392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867230.31474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867230.33160: stdout chunk (state=3): >>>/root <<< 13131 1726867230.33483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867230.33486: stdout chunk (state=3): >>><<< 13131 1726867230.33489: stderr chunk (state=3): >>><<< 13131 1726867230.33493: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867230.33495: _low_level_execute_command(): starting 13131 1726867230.33498: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867230.3331738-15278-198042048742657 `" && echo ansible-tmp-1726867230.3331738-15278-198042048742657="` echo /root/.ansible/tmp/ansible-tmp-1726867230.3331738-15278-198042048742657 `" ) && sleep 0' 13131 1726867230.34665: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867230.34669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867230.34672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867230.34867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867230.34890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867230.34894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867230.34896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867230.34991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867230.36869: stdout chunk (state=3): >>>ansible-tmp-1726867230.3331738-15278-198042048742657=/root/.ansible/tmp/ansible-tmp-1726867230.3331738-15278-198042048742657 <<< 13131 1726867230.36989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867230.37097: stderr chunk (state=3): >>><<< 13131 1726867230.37100: stdout chunk (state=3): >>><<< 13131 1726867230.37170: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867230.3331738-15278-198042048742657=/root/.ansible/tmp/ansible-tmp-1726867230.3331738-15278-198042048742657 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867230.37245: variable 'ansible_module_compression' from source: unknown 13131 1726867230.37303: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13131 1726867230.37775: variable 'ansible_facts' from source: unknown 13131 1726867230.38141: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867230.3331738-15278-198042048742657/AnsiballZ_package_facts.py 13131 1726867230.38894: Sending initial data 13131 1726867230.38897: Sent initial data (162 bytes) 13131 1726867230.40175: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867230.40191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867230.40211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867230.40219: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867230.40328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867230.40366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867230.40454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867230.40485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867230.40533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867230.42201: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867230.42205: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867230.42208: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpzizbvglm /root/.ansible/tmp/ansible-tmp-1726867230.3331738-15278-198042048742657/AnsiballZ_package_facts.py <<< 13131 1726867230.42210: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867230.3331738-15278-198042048742657/AnsiballZ_package_facts.py" <<< 13131 1726867230.42383: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpzizbvglm" to remote "/root/.ansible/tmp/ansible-tmp-1726867230.3331738-15278-198042048742657/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867230.3331738-15278-198042048742657/AnsiballZ_package_facts.py" <<< 13131 1726867230.45384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867230.45388: stdout chunk (state=3): >>><<< 13131 1726867230.45391: stderr chunk (state=3): >>><<< 13131 1726867230.45393: done transferring module to remote 13131 1726867230.45486: _low_level_execute_command(): starting 13131 1726867230.45490: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867230.3331738-15278-198042048742657/ /root/.ansible/tmp/ansible-tmp-1726867230.3331738-15278-198042048742657/AnsiballZ_package_facts.py && sleep 0' 13131 1726867230.46761: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867230.46920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867230.46978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867230.46997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867230.47070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867230.48984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867230.48987: stdout chunk (state=3): >>><<< 13131 1726867230.48990: stderr chunk (state=3): >>><<< 13131 1726867230.48992: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867230.48995: _low_level_execute_command(): starting 13131 1726867230.48998: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867230.3331738-15278-198042048742657/AnsiballZ_package_facts.py && sleep 0' 13131 1726867230.50283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867230.50340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867230.50372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867230.50483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867230.50546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867230.50619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867230.50638: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 13131 1726867230.50886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867230.94485: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", 
"version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": 
[{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 13131 1726867230.94575: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": 
"keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": 
"3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": 
[{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", 
"release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": 
[{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": 
[{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": 
[{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": 
"grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": 
"2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": 
"1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": 
"rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": 
[{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 13131 1726867230.94817: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": 
"perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", 
"version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": 
"git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", 
"version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": 
"python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13131 1726867230.97258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867230.97262: stdout chunk (state=3): >>><<< 13131 1726867230.97264: stderr chunk (state=3): >>><<< 13131 1726867230.97702: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867231.05561: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867230.3331738-15278-198042048742657/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867231.05581: _low_level_execute_command(): starting 13131 1726867231.05691: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867230.3331738-15278-198042048742657/ > /dev/null 2>&1 && sleep 0' 13131 1726867231.06862: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867231.06866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867231.06984: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867231.07109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867231.07164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867231.09022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867231.09126: stderr chunk (state=3): >>><<< 13131 1726867231.09130: stdout chunk (state=3): >>><<< 13131 1726867231.09147: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867231.09160: handler run complete 13131 1726867231.10664: variable 'ansible_facts' from source: unknown 13131 1726867231.11309: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867231.13759: variable 'ansible_facts' from source: unknown 13131 1726867231.14302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867231.15205: attempt loop complete, returning result 13131 1726867231.15209: _execute() done 13131 1726867231.15211: dumping result to json 13131 1726867231.15327: done dumping result, returning 13131 1726867231.15337: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-5f24-9b7a-00000000091a] 13131 1726867231.15340: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000091a 13131 1726867231.18892: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000091a 13131 1726867231.18896: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867231.19050: no more pending results, returning what we have 13131 1726867231.19053: results queue empty 13131 1726867231.19054: checking for any_errors_fatal 13131 1726867231.19058: done checking for any_errors_fatal 13131 1726867231.19059: checking for max_fail_percentage 13131 1726867231.19060: done checking for max_fail_percentage 13131 1726867231.19061: checking to see if all hosts have failed and the running result is not ok 13131 1726867231.19062: done checking to see if all hosts have failed 13131 1726867231.19063: getting the remaining hosts for this loop 13131 1726867231.19064: done getting the remaining hosts for this loop 13131 1726867231.19067: getting the next task for host managed_node1 13131 1726867231.19073: done getting next task for host managed_node1 13131 1726867231.19076: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13131 1726867231.19082: 
^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13131 1726867231.19094: getting variables 13131 1726867231.19095: in VariableManager get_vars() 13131 1726867231.19138: Calling all_inventory to load vars for managed_node1 13131 1726867231.19141: Calling groups_inventory to load vars for managed_node1 13131 1726867231.19143: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867231.19152: Calling all_plugins_play to load vars for managed_node1 13131 1726867231.19155: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867231.19158: Calling groups_plugins_play to load vars for managed_node1 13131 1726867231.22694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867231.25557: done with get_vars() 13131 1726867231.25810: done getting variables 13131 1726867231.25869: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 
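[Annotation] The `censored` result shown above is expected: the role runs `package_facts` with `no_log: true`, so Ansible replaces the (large) package inventory with a placeholder in the play output while still storing `ansible_facts.packages`. A minimal sketch of that pattern — a hypothetical standalone task, not the role's verbatim task file — matching the `module_args` visible in the log (`manager: auto`):

```yaml
# Hypothetical reproduction of the pattern logged above; not the
# actual task from fedora.linux_system_roles.network.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto
  no_log: true   # suppresses the result in output; facts are still set
```

With `no_log: true`, the task result prints only `"censored": "the output has been hidden ..."` exactly as seen in the log, but later tasks can still read `ansible_facts.packages`.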
TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:20:31 -0400 (0:00:01.031) 0:00:46.370 ****** 13131 1726867231.26009: entering _queue_task() for managed_node1/debug 13131 1726867231.27133: worker is 1 (out of 1 available) 13131 1726867231.27144: exiting _queue_task() for managed_node1/debug 13131 1726867231.27155: done queuing things up, now waiting for results queue to drain 13131 1726867231.27156: waiting for pending results... 13131 1726867231.27525: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 13131 1726867231.27770: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000016d 13131 1726867231.27774: variable 'ansible_search_path' from source: unknown 13131 1726867231.27779: variable 'ansible_search_path' from source: unknown 13131 1726867231.27783: calling self._execute() 13131 1726867231.27889: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867231.27912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867231.27924: variable 'omit' from source: magic vars 13131 1726867231.28440: variable 'ansible_distribution_major_version' from source: facts 13131 1726867231.28459: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867231.28469: variable 'omit' from source: magic vars 13131 1726867231.28544: variable 'omit' from source: magic vars 13131 1726867231.28653: variable 'network_provider' from source: set_fact 13131 1726867231.28745: variable 'omit' from source: magic vars 13131 1726867231.28748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867231.28759: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 
1726867231.28784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867231.28805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867231.28822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867231.28861: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867231.28870: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867231.28883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867231.28988: Set connection var ansible_connection to ssh 13131 1726867231.29002: Set connection var ansible_timeout to 10 13131 1726867231.29009: Set connection var ansible_shell_type to sh 13131 1726867231.29020: Set connection var ansible_shell_executable to /bin/sh 13131 1726867231.29030: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867231.29039: Set connection var ansible_pipelining to False 13131 1726867231.29073: variable 'ansible_shell_executable' from source: unknown 13131 1726867231.29076: variable 'ansible_connection' from source: unknown 13131 1726867231.29102: variable 'ansible_module_compression' from source: unknown 13131 1726867231.29105: variable 'ansible_shell_type' from source: unknown 13131 1726867231.29107: variable 'ansible_shell_executable' from source: unknown 13131 1726867231.29109: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867231.29111: variable 'ansible_pipelining' from source: unknown 13131 1726867231.29113: variable 'ansible_timeout' from source: unknown 13131 1726867231.29184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867231.29265: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867231.29287: variable 'omit' from source: magic vars 13131 1726867231.29301: starting attempt loop 13131 1726867231.29318: running the handler 13131 1726867231.29404: handler run complete 13131 1726867231.29407: attempt loop complete, returning result 13131 1726867231.29409: _execute() done 13131 1726867231.29411: dumping result to json 13131 1726867231.29413: done dumping result, returning 13131 1726867231.29482: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-5f24-9b7a-00000000016d] 13131 1726867231.29486: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000016d 13131 1726867231.29797: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000016d 13131 1726867231.29801: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 13131 1726867231.29869: no more pending results, returning what we have 13131 1726867231.29872: results queue empty 13131 1726867231.29873: checking for any_errors_fatal 13131 1726867231.29881: done checking for any_errors_fatal 13131 1726867231.29882: checking for max_fail_percentage 13131 1726867231.29883: done checking for max_fail_percentage 13131 1726867231.29884: checking to see if all hosts have failed and the running result is not ok 13131 1726867231.29885: done checking to see if all hosts have failed 13131 1726867231.29886: getting the remaining hosts for this loop 13131 1726867231.29887: done getting the remaining hosts for this loop 13131 1726867231.29890: getting the next task for host managed_node1 13131 1726867231.29896: done getting next task for host managed_node1 13131 1726867231.29899: ^ task is: TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13131 1726867231.29903: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867231.29917: getting variables 13131 1726867231.29919: in VariableManager get_vars() 13131 1726867231.29963: Calling all_inventory to load vars for managed_node1 13131 1726867231.29966: Calling groups_inventory to load vars for managed_node1 13131 1726867231.29969: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867231.30095: Calling all_plugins_play to load vars for managed_node1 13131 1726867231.30100: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867231.30104: Calling groups_plugins_play to load vars for managed_node1 13131 1726867231.33522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867231.39010: done with get_vars() 13131 1726867231.39210: done getting variables 13131 1726867231.39275: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:20:31 -0400 (0:00:00.133) 0:00:46.504 ****** 13131 1726867231.39398: entering _queue_task() for managed_node1/fail 13131 1726867231.40622: worker is 1 (out of 1 available) 13131 1726867231.40636: exiting _queue_task() for managed_node1/fail 13131 1726867231.40648: done queuing things up, now waiting for results queue to drain 13131 1726867231.40649: waiting for pending results... 
13131 1726867231.41098: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13131 1726867231.41336: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000016e 13131 1726867231.41410: variable 'ansible_search_path' from source: unknown 13131 1726867231.41414: variable 'ansible_search_path' from source: unknown 13131 1726867231.41451: calling self._execute() 13131 1726867231.41885: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867231.41889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867231.41891: variable 'omit' from source: magic vars 13131 1726867231.42949: variable 'ansible_distribution_major_version' from source: facts 13131 1726867231.43010: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867231.43528: variable 'network_state' from source: role '' defaults 13131 1726867231.43637: Evaluated conditional (network_state != {}): False 13131 1726867231.43680: when evaluation is False, skipping this task 13131 1726867231.43683: _execute() done 13131 1726867231.43688: dumping result to json 13131 1726867231.43691: done dumping result, returning 13131 1726867231.43696: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-5f24-9b7a-00000000016e] 13131 1726867231.43776: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000016e 13131 1726867231.43980: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000016e skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867231.44044: no more pending results, returning what we have 13131 1726867231.44049: results 
queue empty 13131 1726867231.44050: checking for any_errors_fatal 13131 1726867231.44059: done checking for any_errors_fatal 13131 1726867231.44060: checking for max_fail_percentage 13131 1726867231.44062: done checking for max_fail_percentage 13131 1726867231.44063: checking to see if all hosts have failed and the running result is not ok 13131 1726867231.44063: done checking to see if all hosts have failed 13131 1726867231.44064: getting the remaining hosts for this loop 13131 1726867231.44065: done getting the remaining hosts for this loop 13131 1726867231.44070: getting the next task for host managed_node1 13131 1726867231.44079: done getting next task for host managed_node1 13131 1726867231.44085: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13131 1726867231.44090: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867231.44118: getting variables 13131 1726867231.44120: in VariableManager get_vars() 13131 1726867231.44499: Calling all_inventory to load vars for managed_node1 13131 1726867231.44502: Calling groups_inventory to load vars for managed_node1 13131 1726867231.44506: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867231.44519: Calling all_plugins_play to load vars for managed_node1 13131 1726867231.44522: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867231.44524: Calling groups_plugins_play to load vars for managed_node1 13131 1726867231.45081: WORKER PROCESS EXITING 13131 1726867231.49362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867231.55087: done with get_vars() 13131 1726867231.55190: done getting variables 13131 1726867231.55499: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:20:31 -0400 (0:00:00.161) 0:00:46.665 ****** 13131 1726867231.55539: entering _queue_task() for managed_node1/fail 13131 1726867231.56741: worker is 1 (out of 1 available) 13131 1726867231.56756: exiting _queue_task() for managed_node1/fail 13131 1726867231.56768: done queuing things up, now waiting for results queue to drain 13131 1726867231.56770: waiting for pending results... 
13131 1726867231.57715: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13131 1726867231.58208: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000016f 13131 1726867231.58313: variable 'ansible_search_path' from source: unknown 13131 1726867231.58318: variable 'ansible_search_path' from source: unknown 13131 1726867231.58449: calling self._execute() 13131 1726867231.58706: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867231.58748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867231.58758: variable 'omit' from source: magic vars 13131 1726867231.59723: variable 'ansible_distribution_major_version' from source: facts 13131 1726867231.59786: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867231.60088: variable 'network_state' from source: role '' defaults 13131 1726867231.60156: Evaluated conditional (network_state != {}): False 13131 1726867231.60159: when evaluation is False, skipping this task 13131 1726867231.60161: _execute() done 13131 1726867231.60163: dumping result to json 13131 1726867231.60165: done dumping result, returning 13131 1726867231.60168: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-5f24-9b7a-00000000016f] 13131 1726867231.60170: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000016f 13131 1726867231.60442: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000016f 13131 1726867231.60445: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867231.60527: no more pending results, returning what we have 13131 
1726867231.60532: results queue empty 13131 1726867231.60533: checking for any_errors_fatal 13131 1726867231.60544: done checking for any_errors_fatal 13131 1726867231.60545: checking for max_fail_percentage 13131 1726867231.60546: done checking for max_fail_percentage 13131 1726867231.60547: checking to see if all hosts have failed and the running result is not ok 13131 1726867231.60548: done checking to see if all hosts have failed 13131 1726867231.60549: getting the remaining hosts for this loop 13131 1726867231.60550: done getting the remaining hosts for this loop 13131 1726867231.60553: getting the next task for host managed_node1 13131 1726867231.60560: done getting next task for host managed_node1 13131 1726867231.60564: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13131 1726867231.60569: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867231.60597: getting variables 13131 1726867231.60599: in VariableManager get_vars() 13131 1726867231.60651: Calling all_inventory to load vars for managed_node1 13131 1726867231.60654: Calling groups_inventory to load vars for managed_node1 13131 1726867231.60656: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867231.60669: Calling all_plugins_play to load vars for managed_node1 13131 1726867231.60673: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867231.60676: Calling groups_plugins_play to load vars for managed_node1 13131 1726867231.64858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867231.69646: done with get_vars() 13131 1726867231.69954: done getting variables 13131 1726867231.70023: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:20:31 -0400 (0:00:00.145) 0:00:46.811 ****** 13131 1726867231.70061: entering _queue_task() for managed_node1/fail 13131 1726867231.70825: worker is 1 (out of 1 available) 13131 1726867231.70837: exiting _queue_task() for managed_node1/fail 13131 1726867231.70848: done queuing things up, now waiting for results queue to drain 13131 1726867231.70850: waiting for pending results... 
13131 1726867231.71659: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13131 1726867231.72070: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000170 13131 1726867231.72085: variable 'ansible_search_path' from source: unknown 13131 1726867231.72088: variable 'ansible_search_path' from source: unknown 13131 1726867231.72284: calling self._execute() 13131 1726867231.72683: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867231.72689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867231.72692: variable 'omit' from source: magic vars 13131 1726867231.72894: variable 'ansible_distribution_major_version' from source: facts 13131 1726867231.72908: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867231.73109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867231.83563: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867231.83658: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867231.83707: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867231.83808: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867231.83851: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867231.83921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867231.84085: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867231.84117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867231.84211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867231.84293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867231.84476: variable 'ansible_distribution_major_version' from source: facts 13131 1726867231.84707: Evaluated conditional (ansible_distribution_major_version | int > 9): True 13131 1726867231.84864: variable 'ansible_distribution' from source: facts 13131 1726867231.84875: variable '__network_rh_distros' from source: role '' defaults 13131 1726867231.84944: Evaluated conditional (ansible_distribution in __network_rh_distros): True 13131 1726867231.85490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867231.85527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867231.85558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 
1726867231.85614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867231.85648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867231.85722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867231.85760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867231.85850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867231.85863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867231.85884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867231.85936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867231.85971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13131 1726867231.86003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867231.86129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867231.86133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867231.86405: variable 'network_connections' from source: task vars 13131 1726867231.86420: variable 'controller_profile' from source: play vars 13131 1726867231.86508: variable 'controller_profile' from source: play vars 13131 1726867231.86571: variable 'network_state' from source: role '' defaults 13131 1726867231.86616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867231.86801: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867231.86853: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867231.86899: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867231.86942: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867231.86989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867231.87037: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867231.87115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867231.87119: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867231.87133: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 13131 1726867231.87148: when evaluation is False, skipping this task 13131 1726867231.87155: _execute() done 13131 1726867231.87161: dumping result to json 13131 1726867231.87166: done dumping result, returning 13131 1726867231.87178: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-5f24-9b7a-000000000170] 13131 1726867231.87193: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000170 13131 1726867231.87326: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000170 13131 1726867231.87329: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 13131 1726867231.87480: no more pending results, returning what we have 13131 
1726867231.87484: results queue empty 13131 1726867231.87485: checking for any_errors_fatal 13131 1726867231.87490: done checking for any_errors_fatal 13131 1726867231.87491: checking for max_fail_percentage 13131 1726867231.87493: done checking for max_fail_percentage 13131 1726867231.87494: checking to see if all hosts have failed and the running result is not ok 13131 1726867231.87495: done checking to see if all hosts have failed 13131 1726867231.87496: getting the remaining hosts for this loop 13131 1726867231.87497: done getting the remaining hosts for this loop 13131 1726867231.87501: getting the next task for host managed_node1 13131 1726867231.87507: done getting next task for host managed_node1 13131 1726867231.87511: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13131 1726867231.87516: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867231.87536: getting variables 13131 1726867231.87538: in VariableManager get_vars() 13131 1726867231.87812: Calling all_inventory to load vars for managed_node1 13131 1726867231.87815: Calling groups_inventory to load vars for managed_node1 13131 1726867231.87818: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867231.87827: Calling all_plugins_play to load vars for managed_node1 13131 1726867231.87830: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867231.87833: Calling groups_plugins_play to load vars for managed_node1 13131 1726867231.95640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867231.97166: done with get_vars() 13131 1726867231.97189: done getting variables 13131 1726867231.97242: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:20:31 -0400 (0:00:00.272) 0:00:47.083 ****** 13131 1726867231.97272: entering _queue_task() for managed_node1/dnf 13131 1726867231.97627: worker is 1 (out of 1 available) 13131 1726867231.97641: exiting _queue_task() for managed_node1/dnf 13131 1726867231.97654: done queuing things up, now waiting for results queue to drain 13131 1726867231.97655: waiting for pending results... 
13131 1726867231.98139: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13131 1726867231.98144: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000171 13131 1726867231.98148: variable 'ansible_search_path' from source: unknown 13131 1726867231.98151: variable 'ansible_search_path' from source: unknown 13131 1726867231.98196: calling self._execute() 13131 1726867231.98266: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867231.98274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867231.98288: variable 'omit' from source: magic vars 13131 1726867231.98697: variable 'ansible_distribution_major_version' from source: facts 13131 1726867231.98739: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867231.98915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867232.01749: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867232.01829: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867232.01864: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867232.01902: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867232.01929: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867232.02010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867232.02037: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867232.02072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.02109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867232.02183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867232.02343: variable 'ansible_distribution' from source: facts 13131 1726867232.02372: variable 'ansible_distribution_major_version' from source: facts 13131 1726867232.02442: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13131 1726867232.02544: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867232.02936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867232.03120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867232.03196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.03235: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867232.03253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867232.03883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867232.03887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867232.03889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.03908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867232.03919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867232.03959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867232.04144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 
1726867232.04150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.04152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867232.04158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867232.04629: variable 'network_connections' from source: task vars 13131 1726867232.04687: variable 'controller_profile' from source: play vars 13131 1726867232.04826: variable 'controller_profile' from source: play vars 13131 1726867232.05061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867232.05222: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867232.05257: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867232.05289: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867232.05340: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867232.05370: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867232.05393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867232.05661: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.05665: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867232.05726: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867232.06449: variable 'network_connections' from source: task vars 13131 1726867232.06454: variable 'controller_profile' from source: play vars 13131 1726867232.06857: variable 'controller_profile' from source: play vars 13131 1726867232.06860: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13131 1726867232.06863: when evaluation is False, skipping this task 13131 1726867232.06865: _execute() done 13131 1726867232.06867: dumping result to json 13131 1726867232.06869: done dumping result, returning 13131 1726867232.06872: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-000000000171] 13131 1726867232.06874: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000171 13131 1726867232.06996: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000171 13131 1726867232.06999: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13131 1726867232.07066: no more pending results, returning what we have 13131 1726867232.07070: results queue empty 13131 1726867232.07071: checking for any_errors_fatal 13131 1726867232.07083: done checking for 
any_errors_fatal 13131 1726867232.07084: checking for max_fail_percentage 13131 1726867232.07086: done checking for max_fail_percentage 13131 1726867232.07087: checking to see if all hosts have failed and the running result is not ok 13131 1726867232.07088: done checking to see if all hosts have failed 13131 1726867232.07088: getting the remaining hosts for this loop 13131 1726867232.07090: done getting the remaining hosts for this loop 13131 1726867232.07093: getting the next task for host managed_node1 13131 1726867232.07101: done getting next task for host managed_node1 13131 1726867232.07108: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13131 1726867232.07112: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867232.07137: getting variables 13131 1726867232.07139: in VariableManager get_vars() 13131 1726867232.07298: Calling all_inventory to load vars for managed_node1 13131 1726867232.07301: Calling groups_inventory to load vars for managed_node1 13131 1726867232.07306: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867232.07316: Calling all_plugins_play to load vars for managed_node1 13131 1726867232.07319: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867232.07321: Calling groups_plugins_play to load vars for managed_node1 13131 1726867232.10501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867232.13753: done with get_vars() 13131 1726867232.13775: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13131 1726867232.13864: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:20:32 -0400 (0:00:00.171) 0:00:47.254 ****** 13131 1726867232.14380: entering _queue_task() for managed_node1/yum 13131 1726867232.15325: worker is 1 (out of 1 available) 13131 1726867232.15336: exiting _queue_task() for managed_node1/yum 13131 1726867232.15347: done queuing things up, now waiting for results queue to drain 13131 1726867232.15348: waiting for pending results... 
13131 1726867232.16362: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13131 1726867232.16368: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000172 13131 1726867232.16374: variable 'ansible_search_path' from source: unknown 13131 1726867232.16380: variable 'ansible_search_path' from source: unknown 13131 1726867232.16628: calling self._execute() 13131 1726867232.17023: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867232.17027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867232.17031: variable 'omit' from source: magic vars 13131 1726867232.17882: variable 'ansible_distribution_major_version' from source: facts 13131 1726867232.17886: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867232.18137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867232.21530: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867232.21825: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867232.21863: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867232.21899: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867232.22007: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867232.22383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867232.22426: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867232.22446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.22847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867232.22985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867232.23025: variable 'ansible_distribution_major_version' from source: facts 13131 1726867232.23039: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13131 1726867232.23042: when evaluation is False, skipping this task 13131 1726867232.23044: _execute() done 13131 1726867232.23047: dumping result to json 13131 1726867232.23049: done dumping result, returning 13131 1726867232.23058: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-000000000172] 13131 1726867232.23061: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000172 13131 1726867232.23166: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000172 13131 1726867232.23170: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13131 1726867232.23229: no more pending results, returning 
what we have 13131 1726867232.23233: results queue empty 13131 1726867232.23235: checking for any_errors_fatal 13131 1726867232.23240: done checking for any_errors_fatal 13131 1726867232.23241: checking for max_fail_percentage 13131 1726867232.23243: done checking for max_fail_percentage 13131 1726867232.23244: checking to see if all hosts have failed and the running result is not ok 13131 1726867232.23244: done checking to see if all hosts have failed 13131 1726867232.23245: getting the remaining hosts for this loop 13131 1726867232.23246: done getting the remaining hosts for this loop 13131 1726867232.23251: getting the next task for host managed_node1 13131 1726867232.23258: done getting next task for host managed_node1 13131 1726867232.23262: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13131 1726867232.23266: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867232.23290: getting variables 13131 1726867232.23292: in VariableManager get_vars() 13131 1726867232.23347: Calling all_inventory to load vars for managed_node1 13131 1726867232.23350: Calling groups_inventory to load vars for managed_node1 13131 1726867232.23352: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867232.23363: Calling all_plugins_play to load vars for managed_node1 13131 1726867232.23366: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867232.23368: Calling groups_plugins_play to load vars for managed_node1 13131 1726867232.28106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867232.32517: done with get_vars() 13131 1726867232.32544: done getting variables 13131 1726867232.32646: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:20:32 -0400 (0:00:00.183) 0:00:47.437 ****** 13131 1726867232.32809: entering _queue_task() for managed_node1/fail 13131 1726867232.33620: worker is 1 (out of 1 available) 13131 1726867232.33632: exiting _queue_task() for managed_node1/fail 13131 1726867232.33643: done queuing things up, now waiting for results queue to drain 13131 1726867232.33644: waiting for pending results... 
13131 1726867232.34490: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13131 1726867232.34910: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000173 13131 1726867232.34922: variable 'ansible_search_path' from source: unknown 13131 1726867232.34925: variable 'ansible_search_path' from source: unknown 13131 1726867232.35019: calling self._execute() 13131 1726867232.35235: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867232.35238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867232.35249: variable 'omit' from source: magic vars 13131 1726867232.35821: variable 'ansible_distribution_major_version' from source: facts 13131 1726867232.35831: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867232.35959: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867232.36161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867232.38544: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867232.38687: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867232.38691: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867232.38710: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867232.39031: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867232.39035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13131 1726867232.39039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867232.39041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.39044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867232.39046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867232.39049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867232.39051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867232.39188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.39230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867232.39249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867232.39358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867232.39361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867232.39364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.39420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867232.39435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867232.39690: variable 'network_connections' from source: task vars 13131 1726867232.39694: variable 'controller_profile' from source: play vars 13131 1726867232.39732: variable 'controller_profile' from source: play vars 13131 1726867232.39810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867232.39998: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867232.40422: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867232.40451: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 
1726867232.40487: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867232.40535: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867232.40559: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867232.40581: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.40728: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867232.40731: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867232.40948: variable 'network_connections' from source: task vars 13131 1726867232.40953: variable 'controller_profile' from source: play vars 13131 1726867232.41065: variable 'controller_profile' from source: play vars 13131 1726867232.41091: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13131 1726867232.41095: when evaluation is False, skipping this task 13131 1726867232.41099: _execute() done 13131 1726867232.41102: dumping result to json 13131 1726867232.41105: done dumping result, returning 13131 1726867232.41184: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-000000000173] 13131 1726867232.41187: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000173 13131 1726867232.41529: 
done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000173 13131 1726867232.41532: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13131 1726867232.41588: no more pending results, returning what we have 13131 1726867232.41591: results queue empty 13131 1726867232.41592: checking for any_errors_fatal 13131 1726867232.41598: done checking for any_errors_fatal 13131 1726867232.41598: checking for max_fail_percentage 13131 1726867232.41600: done checking for max_fail_percentage 13131 1726867232.41601: checking to see if all hosts have failed and the running result is not ok 13131 1726867232.41602: done checking to see if all hosts have failed 13131 1726867232.41602: getting the remaining hosts for this loop 13131 1726867232.41605: done getting the remaining hosts for this loop 13131 1726867232.41609: getting the next task for host managed_node1 13131 1726867232.41614: done getting next task for host managed_node1 13131 1726867232.41618: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13131 1726867232.41622: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13131 1726867232.41639: getting variables 13131 1726867232.41640: in VariableManager get_vars() 13131 1726867232.41826: Calling all_inventory to load vars for managed_node1 13131 1726867232.41830: Calling groups_inventory to load vars for managed_node1 13131 1726867232.41833: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867232.41842: Calling all_plugins_play to load vars for managed_node1 13131 1726867232.41845: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867232.41848: Calling groups_plugins_play to load vars for managed_node1 13131 1726867232.43485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867232.47868: done with get_vars() 13131 1726867232.47924: done getting variables 13131 1726867232.47988: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:20:32 -0400 (0:00:00.153) 0:00:47.590 ****** 13131 1726867232.48029: entering _queue_task() for managed_node1/package 13131 1726867232.49062: worker is 1 (out of 1 available) 13131 1726867232.49073: exiting _queue_task() for managed_node1/package 13131 1726867232.49084: done queuing things up, now waiting for results queue to drain 13131 1726867232.49085: waiting for pending results... 
13131 1726867232.49694: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 13131 1726867232.49824: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000174 13131 1726867232.49846: variable 'ansible_search_path' from source: unknown 13131 1726867232.49855: variable 'ansible_search_path' from source: unknown 13131 1726867232.49900: calling self._execute() 13131 1726867232.50295: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867232.50300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867232.50303: variable 'omit' from source: magic vars 13131 1726867232.50940: variable 'ansible_distribution_major_version' from source: facts 13131 1726867232.51213: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867232.51621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867232.52406: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867232.52465: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867232.52469: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867232.52541: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867232.52828: variable 'network_packages' from source: role '' defaults 13131 1726867232.52979: variable '__network_provider_setup' from source: role '' defaults 13131 1726867232.52999: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867232.53079: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867232.53131: variable '__network_packages_default_nm' from source: role '' defaults 13131 1726867232.53345: variable 
'__network_packages_default_nm' from source: role '' defaults 13131 1726867232.53826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867232.56692: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867232.56764: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867232.56813: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867232.56853: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867232.56886: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867232.56987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867232.57038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867232.57061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.57146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867232.57296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 
1726867232.57324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867232.57369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867232.57487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.57500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867232.57525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867232.57826: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13131 1726867232.57964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867232.58000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867232.58045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.58119: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867232.58150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867232.58261: variable 'ansible_python' from source: facts 13131 1726867232.58345: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13131 1726867232.58400: variable '__network_wpa_supplicant_required' from source: role '' defaults 13131 1726867232.58526: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13131 1726867232.58730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867232.58766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867232.58865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.59205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867232.59209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867232.59211: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867232.59422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867232.59424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.59427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867232.59429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867232.59729: variable 'network_connections' from source: task vars 13131 1726867232.59783: variable 'controller_profile' from source: play vars 13131 1726867232.60031: variable 'controller_profile' from source: play vars 13131 1726867232.60187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867232.60227: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867232.60261: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.60328: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867232.60401: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867232.60902: variable 'network_connections' from source: task vars 13131 1726867232.60908: variable 'controller_profile' from source: play vars 13131 1726867232.61061: variable 'controller_profile' from source: play vars 13131 1726867232.61121: variable '__network_packages_default_wireless' from source: role '' defaults 13131 1726867232.61241: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867232.61780: variable 'network_connections' from source: task vars 13131 1726867232.61799: variable 'controller_profile' from source: play vars 13131 1726867232.61883: variable 'controller_profile' from source: play vars 13131 1726867232.61886: variable '__network_packages_default_team' from source: role '' defaults 13131 1726867232.61970: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867232.62295: variable 'network_connections' from source: task vars 13131 1726867232.62309: variable 'controller_profile' from source: play vars 13131 1726867232.62456: variable 'controller_profile' from source: play vars 13131 1726867232.62460: variable '__network_service_name_default_initscripts' from source: role '' defaults 13131 1726867232.62536: variable '__network_service_name_default_initscripts' from source: role '' defaults 13131 1726867232.62548: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 1726867232.62621: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 1726867232.62840: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13131 1726867232.63430: variable 'network_connections' from source: task vars 13131 
1726867232.63442: variable 'controller_profile' from source: play vars 13131 1726867232.63509: variable 'controller_profile' from source: play vars 13131 1726867232.63523: variable 'ansible_distribution' from source: facts 13131 1726867232.63531: variable '__network_rh_distros' from source: role '' defaults 13131 1726867232.63552: variable 'ansible_distribution_major_version' from source: facts 13131 1726867232.63583: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13131 1726867232.63761: variable 'ansible_distribution' from source: facts 13131 1726867232.63862: variable '__network_rh_distros' from source: role '' defaults 13131 1726867232.63865: variable 'ansible_distribution_major_version' from source: facts 13131 1726867232.63867: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13131 1726867232.64106: variable 'ansible_distribution' from source: facts 13131 1726867232.64117: variable '__network_rh_distros' from source: role '' defaults 13131 1726867232.64128: variable 'ansible_distribution_major_version' from source: facts 13131 1726867232.64183: variable 'network_provider' from source: set_fact 13131 1726867232.64219: variable 'ansible_facts' from source: unknown 13131 1726867232.65018: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13131 1726867232.65033: when evaluation is False, skipping this task 13131 1726867232.65087: _execute() done 13131 1726867232.65090: dumping result to json 13131 1726867232.65093: done dumping result, returning 13131 1726867232.65095: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-5f24-9b7a-000000000174] 13131 1726867232.65098: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000174 skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", 
"skip_reason": "Conditional result was False" } 13131 1726867232.65360: no more pending results, returning what we have 13131 1726867232.65364: results queue empty 13131 1726867232.65365: checking for any_errors_fatal 13131 1726867232.65373: done checking for any_errors_fatal 13131 1726867232.65373: checking for max_fail_percentage 13131 1726867232.65375: done checking for max_fail_percentage 13131 1726867232.65376: checking to see if all hosts have failed and the running result is not ok 13131 1726867232.65379: done checking to see if all hosts have failed 13131 1726867232.65380: getting the remaining hosts for this loop 13131 1726867232.65381: done getting the remaining hosts for this loop 13131 1726867232.65385: getting the next task for host managed_node1 13131 1726867232.65394: done getting next task for host managed_node1 13131 1726867232.65398: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13131 1726867232.65401: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867232.65432: getting variables 13131 1726867232.65434: in VariableManager get_vars() 13131 1726867232.65710: Calling all_inventory to load vars for managed_node1 13131 1726867232.65713: Calling groups_inventory to load vars for managed_node1 13131 1726867232.65715: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867232.65725: Calling all_plugins_play to load vars for managed_node1 13131 1726867232.65738: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867232.65732: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000174 13131 1726867232.65742: Calling groups_plugins_play to load vars for managed_node1 13131 1726867232.65771: WORKER PROCESS EXITING 13131 1726867232.68349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867232.70419: done with get_vars() 13131 1726867232.70462: done getting variables 13131 1726867232.70529: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:20:32 -0400 (0:00:00.225) 0:00:47.816 ****** 13131 1726867232.70581: entering _queue_task() for managed_node1/package 13131 1726867232.71009: worker is 1 (out of 1 available) 13131 1726867232.71023: exiting _queue_task() for managed_node1/package 13131 1726867232.71035: done queuing things up, now waiting for results queue to drain 13131 1726867232.71036: waiting for pending results... 
13131 1726867232.71260: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13131 1726867232.71489: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000175 13131 1726867232.71493: variable 'ansible_search_path' from source: unknown 13131 1726867232.71496: variable 'ansible_search_path' from source: unknown 13131 1726867232.71498: calling self._execute() 13131 1726867232.71545: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867232.71551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867232.71560: variable 'omit' from source: magic vars 13131 1726867232.71993: variable 'ansible_distribution_major_version' from source: facts 13131 1726867232.72007: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867232.72130: variable 'network_state' from source: role '' defaults 13131 1726867232.72139: Evaluated conditional (network_state != {}): False 13131 1726867232.72141: when evaluation is False, skipping this task 13131 1726867232.72144: _execute() done 13131 1726867232.72147: dumping result to json 13131 1726867232.72149: done dumping result, returning 13131 1726867232.72158: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-5f24-9b7a-000000000175] 13131 1726867232.72163: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000175 13131 1726867232.72291: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000175 13131 1726867232.72296: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867232.72432: no more pending results, returning what we have 13131 1726867232.72438: results queue empty 13131 1726867232.72439: checking 
for any_errors_fatal 13131 1726867232.72448: done checking for any_errors_fatal 13131 1726867232.72451: checking for max_fail_percentage 13131 1726867232.72453: done checking for max_fail_percentage 13131 1726867232.72454: checking to see if all hosts have failed and the running result is not ok 13131 1726867232.72455: done checking to see if all hosts have failed 13131 1726867232.72455: getting the remaining hosts for this loop 13131 1726867232.72456: done getting the remaining hosts for this loop 13131 1726867232.72459: getting the next task for host managed_node1 13131 1726867232.72466: done getting next task for host managed_node1 13131 1726867232.72468: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13131 1726867232.72474: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867232.72497: getting variables 13131 1726867232.72498: in VariableManager get_vars() 13131 1726867232.72543: Calling all_inventory to load vars for managed_node1 13131 1726867232.72546: Calling groups_inventory to load vars for managed_node1 13131 1726867232.72548: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867232.72557: Calling all_plugins_play to load vars for managed_node1 13131 1726867232.72560: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867232.72562: Calling groups_plugins_play to load vars for managed_node1 13131 1726867232.74565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867232.76358: done with get_vars() 13131 1726867232.76390: done getting variables 13131 1726867232.76455: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:20:32 -0400 (0:00:00.059) 0:00:47.875 ****** 13131 1726867232.76489: entering _queue_task() for managed_node1/package 13131 1726867232.76812: worker is 1 (out of 1 available) 13131 1726867232.76826: exiting _queue_task() for managed_node1/package 13131 1726867232.76949: done queuing things up, now waiting for results queue to drain 13131 1726867232.76951: waiting for pending results... 
13131 1726867232.77512: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13131 1726867232.77518: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000176 13131 1726867232.77522: variable 'ansible_search_path' from source: unknown 13131 1726867232.77525: variable 'ansible_search_path' from source: unknown 13131 1726867232.77528: calling self._execute() 13131 1726867232.77619: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867232.77623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867232.77647: variable 'omit' from source: magic vars 13131 1726867232.78105: variable 'ansible_distribution_major_version' from source: facts 13131 1726867232.78120: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867232.78268: variable 'network_state' from source: role '' defaults 13131 1726867232.78279: Evaluated conditional (network_state != {}): False 13131 1726867232.78282: when evaluation is False, skipping this task 13131 1726867232.78285: _execute() done 13131 1726867232.78287: dumping result to json 13131 1726867232.78294: done dumping result, returning 13131 1726867232.78302: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-5f24-9b7a-000000000176] 13131 1726867232.78310: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000176 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867232.78613: no more pending results, returning what we have 13131 1726867232.78617: results queue empty 13131 1726867232.78618: checking for any_errors_fatal 13131 1726867232.78623: done checking for any_errors_fatal 13131 1726867232.78624: checking for max_fail_percentage 13131 
1726867232.78626: done checking for max_fail_percentage 13131 1726867232.78627: checking to see if all hosts have failed and the running result is not ok 13131 1726867232.78627: done checking to see if all hosts have failed 13131 1726867232.78628: getting the remaining hosts for this loop 13131 1726867232.78629: done getting the remaining hosts for this loop 13131 1726867232.78632: getting the next task for host managed_node1 13131 1726867232.78637: done getting next task for host managed_node1 13131 1726867232.78640: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13131 1726867232.78643: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867232.78660: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000176 13131 1726867232.78663: WORKER PROCESS EXITING 13131 1726867232.78708: getting variables 13131 1726867232.78710: in VariableManager get_vars() 13131 1726867232.78748: Calling all_inventory to load vars for managed_node1 13131 1726867232.78750: Calling groups_inventory to load vars for managed_node1 13131 1726867232.78764: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867232.78779: Calling all_plugins_play to load vars for managed_node1 13131 1726867232.78782: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867232.78785: Calling groups_plugins_play to load vars for managed_node1 13131 1726867232.80339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867232.82100: done with get_vars() 13131 1726867232.82126: done getting variables 13131 1726867232.82254: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:20:32 -0400 (0:00:00.058) 0:00:47.933 ****** 13131 1726867232.82293: entering _queue_task() for managed_node1/service 13131 1726867232.82638: worker is 1 (out of 1 available) 13131 1726867232.82655: exiting _queue_task() for managed_node1/service 13131 1726867232.82671: done queuing things up, now waiting for results queue to drain 13131 1726867232.82672: waiting for pending results... 
13131 1726867232.83018: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13131 1726867232.83245: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000177 13131 1726867232.83251: variable 'ansible_search_path' from source: unknown 13131 1726867232.83256: variable 'ansible_search_path' from source: unknown 13131 1726867232.83259: calling self._execute() 13131 1726867232.83415: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867232.83418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867232.83422: variable 'omit' from source: magic vars 13131 1726867232.83887: variable 'ansible_distribution_major_version' from source: facts 13131 1726867232.83890: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867232.84091: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867232.84259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867232.87185: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867232.87189: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867232.87191: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867232.87193: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867232.87195: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867232.87389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13131 1726867232.87393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867232.87395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.87398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867232.87400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867232.87455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867232.87481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867232.87504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.87547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867232.87560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867232.87605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867232.87626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867232.87652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.87701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867232.87717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867232.87899: variable 'network_connections' from source: task vars 13131 1726867232.87913: variable 'controller_profile' from source: play vars 13131 1726867232.87995: variable 'controller_profile' from source: play vars 13131 1726867232.88067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867232.88261: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867232.88299: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867232.88340: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 
1726867232.88368: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867232.88421: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867232.88437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867232.88461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867232.88486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867232.88536: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867232.88897: variable 'network_connections' from source: task vars 13131 1726867232.88919: variable 'controller_profile' from source: play vars 13131 1726867232.88993: variable 'controller_profile' from source: play vars 13131 1726867232.89033: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13131 1726867232.89042: when evaluation is False, skipping this task 13131 1726867232.89049: _execute() done 13131 1726867232.89056: dumping result to json 13131 1726867232.89063: done dumping result, returning 13131 1726867232.89081: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-5f24-9b7a-000000000177] 13131 1726867232.89095: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000177 skipping: [managed_node1] => { "changed": 
false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13131 1726867232.89354: no more pending results, returning what we have 13131 1726867232.89357: results queue empty 13131 1726867232.89359: checking for any_errors_fatal 13131 1726867232.89364: done checking for any_errors_fatal 13131 1726867232.89365: checking for max_fail_percentage 13131 1726867232.89368: done checking for max_fail_percentage 13131 1726867232.89369: checking to see if all hosts have failed and the running result is not ok 13131 1726867232.89370: done checking to see if all hosts have failed 13131 1726867232.89370: getting the remaining hosts for this loop 13131 1726867232.89372: done getting the remaining hosts for this loop 13131 1726867232.89375: getting the next task for host managed_node1 13131 1726867232.89459: done getting next task for host managed_node1 13131 1726867232.89464: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13131 1726867232.89468: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867232.89493: getting variables 13131 1726867232.89495: in VariableManager get_vars() 13131 1726867232.89669: Calling all_inventory to load vars for managed_node1 13131 1726867232.89672: Calling groups_inventory to load vars for managed_node1 13131 1726867232.89675: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867232.89686: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000177 13131 1726867232.89689: WORKER PROCESS EXITING 13131 1726867232.89699: Calling all_plugins_play to load vars for managed_node1 13131 1726867232.89702: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867232.89705: Calling groups_plugins_play to load vars for managed_node1 13131 1726867232.91781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867232.94101: done with get_vars() 13131 1726867232.94128: done getting variables 13131 1726867232.94231: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:20:32 -0400 (0:00:00.119) 0:00:48.053 ****** 13131 1726867232.94269: entering _queue_task() for managed_node1/service 13131 1726867232.94822: worker is 1 (out of 1 available) 13131 1726867232.94833: exiting _queue_task() for managed_node1/service 13131 1726867232.94981: done queuing things up, now waiting for results queue to drain 13131 1726867232.94983: waiting for pending results... 
13131 1726867232.95215: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13131 1726867232.95375: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000178 13131 1726867232.95404: variable 'ansible_search_path' from source: unknown 13131 1726867232.95441: variable 'ansible_search_path' from source: unknown 13131 1726867232.95461: calling self._execute() 13131 1726867232.95583: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867232.95597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867232.95658: variable 'omit' from source: magic vars 13131 1726867232.96107: variable 'ansible_distribution_major_version' from source: facts 13131 1726867232.96125: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867232.96324: variable 'network_provider' from source: set_fact 13131 1726867232.96375: variable 'network_state' from source: role '' defaults 13131 1726867232.96381: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13131 1726867232.96384: variable 'omit' from source: magic vars 13131 1726867232.96439: variable 'omit' from source: magic vars 13131 1726867232.96485: variable 'network_service_name' from source: role '' defaults 13131 1726867232.96594: variable 'network_service_name' from source: role '' defaults 13131 1726867232.96661: variable '__network_provider_setup' from source: role '' defaults 13131 1726867232.96671: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867232.96740: variable '__network_service_name_default_nm' from source: role '' defaults 13131 1726867232.96752: variable '__network_packages_default_nm' from source: role '' defaults 13131 1726867232.96819: variable '__network_packages_default_nm' from source: role '' defaults 13131 1726867232.97043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13131 1726867233.00236: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867233.00319: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867233.00364: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867233.00420: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867233.00541: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867233.00683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867233.00733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867233.00784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867233.00839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867233.00865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867233.00973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13131 1726867233.00976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867233.00988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867233.01035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867233.01054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867233.01328: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13131 1726867233.01451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867233.01502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867233.01630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867233.01634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867233.01637: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867233.01711: variable 'ansible_python' from source: facts 13131 1726867233.01746: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13131 1726867233.01836: variable '__network_wpa_supplicant_required' from source: role '' defaults 13131 1726867233.01970: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13131 1726867233.02079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867233.02173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867233.02176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867233.02182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867233.02200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867233.02252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867233.02407: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867233.02410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867233.02412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867233.02414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867233.02552: variable 'network_connections' from source: task vars 13131 1726867233.02568: variable 'controller_profile' from source: play vars 13131 1726867233.02651: variable 'controller_profile' from source: play vars 13131 1726867233.02767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867233.03188: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867233.03220: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867233.03330: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867233.03376: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867233.03550: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867233.03617: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867233.03841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867233.03845: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867233.03892: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867233.04328: variable 'network_connections' from source: task vars 13131 1726867233.04351: variable 'controller_profile' from source: play vars 13131 1726867233.04434: variable 'controller_profile' from source: play vars 13131 1726867233.04476: variable '__network_packages_default_wireless' from source: role '' defaults 13131 1726867233.04570: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867233.05080: variable 'network_connections' from source: task vars 13131 1726867233.05196: variable 'controller_profile' from source: play vars 13131 1726867233.05199: variable 'controller_profile' from source: play vars 13131 1726867233.05201: variable '__network_packages_default_team' from source: role '' defaults 13131 1726867233.05360: variable '__network_team_connections_defined' from source: role '' defaults 13131 1726867233.05948: variable 'network_connections' from source: task vars 13131 1726867233.06032: variable 'controller_profile' from source: play vars 13131 1726867233.06115: variable 'controller_profile' from source: play vars 13131 1726867233.06174: variable '__network_service_name_default_initscripts' from source: role '' defaults 13131 1726867233.06328: variable '__network_service_name_default_initscripts' from 
source: role '' defaults 13131 1726867233.06411: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 1726867233.06785: variable '__network_packages_default_initscripts' from source: role '' defaults 13131 1726867233.06969: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13131 1726867233.08692: variable 'network_connections' from source: task vars 13131 1726867233.08796: variable 'controller_profile' from source: play vars 13131 1726867233.08866: variable 'controller_profile' from source: play vars 13131 1726867233.08882: variable 'ansible_distribution' from source: facts 13131 1726867233.08902: variable '__network_rh_distros' from source: role '' defaults 13131 1726867233.08986: variable 'ansible_distribution_major_version' from source: facts 13131 1726867233.09011: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13131 1726867233.09436: variable 'ansible_distribution' from source: facts 13131 1726867233.09439: variable '__network_rh_distros' from source: role '' defaults 13131 1726867233.09440: variable 'ansible_distribution_major_version' from source: facts 13131 1726867233.09449: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13131 1726867233.09748: variable 'ansible_distribution' from source: facts 13131 1726867233.09788: variable '__network_rh_distros' from source: role '' defaults 13131 1726867233.09807: variable 'ansible_distribution_major_version' from source: facts 13131 1726867233.09842: variable 'network_provider' from source: set_fact 13131 1726867233.09874: variable 'omit' from source: magic vars 13131 1726867233.09915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867233.09946: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867233.09968: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867233.09994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867233.10083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867233.10086: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867233.10088: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867233.10089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867233.10156: Set connection var ansible_connection to ssh 13131 1726867233.10174: Set connection var ansible_timeout to 10 13131 1726867233.10192: Set connection var ansible_shell_type to sh 13131 1726867233.10209: Set connection var ansible_shell_executable to /bin/sh 13131 1726867233.10224: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867233.10240: Set connection var ansible_pipelining to False 13131 1726867233.10268: variable 'ansible_shell_executable' from source: unknown 13131 1726867233.10279: variable 'ansible_connection' from source: unknown 13131 1726867233.10287: variable 'ansible_module_compression' from source: unknown 13131 1726867233.10342: variable 'ansible_shell_type' from source: unknown 13131 1726867233.10345: variable 'ansible_shell_executable' from source: unknown 13131 1726867233.10347: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867233.10349: variable 'ansible_pipelining' from source: unknown 13131 1726867233.10351: variable 'ansible_timeout' from source: unknown 13131 1726867233.10354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867233.10460: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867233.10484: variable 'omit' from source: magic vars 13131 1726867233.10496: starting attempt loop 13131 1726867233.10502: running the handler 13131 1726867233.10626: variable 'ansible_facts' from source: unknown 13131 1726867233.11882: _low_level_execute_command(): starting 13131 1726867233.11886: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867233.13457: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867233.13681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867233.13781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867233.15460: stdout chunk (state=3): >>>/root <<< 13131 1726867233.15561: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 13131 1726867233.15786: stderr chunk (state=3): >>><<< 13131 1726867233.15790: stdout chunk (state=3): >>><<< 13131 1726867233.15793: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867233.15796: _low_level_execute_command(): starting 13131 1726867233.15799: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867233.1565475-15400-192435399837198 `" && echo ansible-tmp-1726867233.1565475-15400-192435399837198="` echo /root/.ansible/tmp/ansible-tmp-1726867233.1565475-15400-192435399837198 `" ) && sleep 0' 13131 1726867233.16334: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867233.16343: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 13131 1726867233.16584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867233.16588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867233.16684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867233.16926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867233.17175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867233.19029: stdout chunk (state=3): >>>ansible-tmp-1726867233.1565475-15400-192435399837198=/root/.ansible/tmp/ansible-tmp-1726867233.1565475-15400-192435399837198 <<< 13131 1726867233.19344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867233.19347: stdout chunk (state=3): >>><<< 13131 1726867233.19356: stderr chunk (state=3): >>><<< 13131 1726867233.19369: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867233.1565475-15400-192435399837198=/root/.ansible/tmp/ansible-tmp-1726867233.1565475-15400-192435399837198 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867233.19464: variable 'ansible_module_compression' from source: unknown 13131 1726867233.19705: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13131 1726867233.19856: variable 'ansible_facts' from source: unknown 13131 1726867233.20281: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867233.1565475-15400-192435399837198/AnsiballZ_systemd.py 13131 1726867233.21027: Sending initial data 13131 1726867233.21030: Sent initial data (156 bytes) 13131 1726867233.22264: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867233.22299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867233.22340: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867233.22590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867233.24217: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13131 1726867233.24221: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867233.24536: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867233.24540: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmph0ovqk9l /root/.ansible/tmp/ansible-tmp-1726867233.1565475-15400-192435399837198/AnsiballZ_systemd.py <<< 13131 1726867233.24543: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867233.1565475-15400-192435399837198/AnsiballZ_systemd.py" <<< 13131 1726867233.25004: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmph0ovqk9l" to remote "/root/.ansible/tmp/ansible-tmp-1726867233.1565475-15400-192435399837198/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867233.1565475-15400-192435399837198/AnsiballZ_systemd.py" <<< 13131 1726867233.29508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867233.29576: stderr chunk (state=3): >>><<< 13131 1726867233.29806: stdout chunk (state=3): >>><<< 13131 1726867233.29810: done transferring module to remote 13131 1726867233.29812: _low_level_execute_command(): starting 13131 1726867233.29815: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867233.1565475-15400-192435399837198/ /root/.ansible/tmp/ansible-tmp-1726867233.1565475-15400-192435399837198/AnsiballZ_systemd.py && sleep 0' 13131 1726867233.31145: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867233.31214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867233.31253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867233.31359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867233.31793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867233.33660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867233.33694: stderr chunk (state=3): >>><<< 13131 1726867233.33706: stdout chunk (state=3): >>><<< 13131 1726867233.33728: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867233.33765: _low_level_execute_command(): starting 13131 1726867233.33775: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867233.1565475-15400-192435399837198/AnsiballZ_systemd.py && sleep 0' 13131 1726867233.35197: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867233.35201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867233.35295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867233.35317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867233.35370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867233.35453: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 13131 1726867233.64311: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": "14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10874880", "MemoryPeak": "14745600", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3307520000", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1097193000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", 
"ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 13131 1726867233.64341: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", 
"SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", 
"Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target NetworkManager-wait-online.service network.target", "After": "dbus-b<<< 13131 1726867233.64393: stdout chunk (state=3): >>>roker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", 
"AssertTimestampMonotonic": "14925366", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13131 1726867233.66216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867233.66320: stderr chunk (state=3): >>><<< 13131 1726867233.66323: stdout chunk (state=3): >>><<< 13131 1726867233.66488: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": "14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10874880", "MemoryPeak": "14745600", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3307520000", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1097193000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": 
"infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", 
"CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target NetworkManager-wait-online.service network.target", "After": "dbus-broker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", "AssertTimestampMonotonic": "14925366", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867233.67067: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867233.1565475-15400-192435399837198/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867233.67182: _low_level_execute_command(): starting 13131 1726867233.67185: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867233.1565475-15400-192435399837198/ > /dev/null 2>&1 && sleep 0' 13131 1726867233.67896: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867233.67913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867233.67936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867233.67963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867233.68047: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867233.68134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867233.68169: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867233.68245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867233.70186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867233.70189: stdout chunk (state=3): >>><<< 13131 1726867233.70333: stderr chunk (state=3): >>><<< 13131 1726867233.70337: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867233.70340: handler run complete 13131 1726867233.70342: attempt loop complete, returning result 13131 1726867233.70344: _execute() done 13131 1726867233.70346: dumping result to json 13131 1726867233.70348: done dumping result, returning 13131 1726867233.70350: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-5f24-9b7a-000000000178] 13131 1726867233.70352: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000178 13131 1726867233.71023: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000178 13131 1726867233.71027: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867233.71094: no more pending results, returning what we have 13131 1726867233.71098: results queue empty 13131 1726867233.71099: checking for any_errors_fatal 13131 1726867233.71106: done checking for any_errors_fatal 13131 1726867233.71107: checking for max_fail_percentage 13131 1726867233.71110: done checking for max_fail_percentage 13131 1726867233.71111: checking to see if all hosts have failed and the running result is not ok 13131 1726867233.71112: done checking to see if all hosts have failed 13131 1726867233.71112: getting the remaining hosts for this loop 13131 1726867233.71114: done getting the remaining hosts for this loop 13131 1726867233.71117: getting 
the next task for host managed_node1 13131 1726867233.71209: done getting next task for host managed_node1 13131 1726867233.71213: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13131 1726867233.71217: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867233.71233: getting variables 13131 1726867233.71235: in VariableManager get_vars() 13131 1726867233.71589: Calling all_inventory to load vars for managed_node1 13131 1726867233.71592: Calling groups_inventory to load vars for managed_node1 13131 1726867233.71594: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867233.71689: Calling all_plugins_play to load vars for managed_node1 13131 1726867233.71692: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867233.71695: Calling groups_plugins_play to load vars for managed_node1 13131 1726867233.74619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867233.78226: done with get_vars() 13131 1726867233.78254: done getting variables 13131 1726867233.78444: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:20:33 -0400 (0:00:00.843) 0:00:48.896 ****** 13131 1726867233.78613: entering _queue_task() for managed_node1/service 13131 1726867233.79337: worker is 1 (out of 1 available) 13131 1726867233.79352: exiting _queue_task() for managed_node1/service 13131 1726867233.79437: done queuing things up, now waiting for results queue to drain 13131 1726867233.79439: waiting for pending results... 
13131 1726867233.79713: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13131 1726867233.79843: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000179 13131 1726867233.79919: variable 'ansible_search_path' from source: unknown 13131 1726867233.79928: variable 'ansible_search_path' from source: unknown 13131 1726867233.79932: calling self._execute() 13131 1726867233.80044: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867233.80058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867233.80076: variable 'omit' from source: magic vars 13131 1726867233.80508: variable 'ansible_distribution_major_version' from source: facts 13131 1726867233.80526: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867233.80784: variable 'network_provider' from source: set_fact 13131 1726867233.80788: Evaluated conditional (network_provider == "nm"): True 13131 1726867233.80813: variable '__network_wpa_supplicant_required' from source: role '' defaults 13131 1726867233.80917: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13131 1726867233.81181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867233.84422: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867233.84499: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867233.84537: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867233.84584: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867233.84615: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867233.84939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867233.84943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867233.84946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867233.84948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867233.84951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867233.84954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867233.84956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867233.84958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867233.85014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867233.85031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867233.85070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867233.85095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867233.85137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867233.85174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867233.85190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867233.85364: variable 'network_connections' from source: task vars 13131 1726867233.85378: variable 'controller_profile' from source: play vars 13131 1726867233.85478: variable 'controller_profile' from source: play vars 13131 1726867233.85750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13131 1726867233.85993: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13131 1726867233.86083: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13131 1726867233.86155: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13131 1726867233.86220: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13131 1726867233.86483: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13131 1726867233.86486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13131 1726867233.86489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867233.86491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13131 1726867233.86563: variable '__network_wireless_connections_defined' from source: role '' defaults 13131 1726867233.87385: variable 'network_connections' from source: task vars 13131 1726867233.87389: variable 'controller_profile' from source: play vars 13131 1726867233.87471: variable 'controller_profile' from source: play vars 13131 1726867233.87510: Evaluated conditional (__network_wpa_supplicant_required): False 13131 1726867233.87513: when evaluation is False, skipping this task 13131 1726867233.87516: _execute() done 13131 1726867233.87518: dumping result to json 13131 1726867233.87520: done dumping result, returning 13131 1726867233.87530: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 
[0affcac9-a3a5-5f24-9b7a-000000000179] 13131 1726867233.87540: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000179 13131 1726867233.87863: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000179 13131 1726867233.87866: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13131 1726867233.87967: no more pending results, returning what we have 13131 1726867233.87971: results queue empty 13131 1726867233.87972: checking for any_errors_fatal 13131 1726867233.88002: done checking for any_errors_fatal 13131 1726867233.88003: checking for max_fail_percentage 13131 1726867233.88006: done checking for max_fail_percentage 13131 1726867233.88006: checking to see if all hosts have failed and the running result is not ok 13131 1726867233.88007: done checking to see if all hosts have failed 13131 1726867233.88008: getting the remaining hosts for this loop 13131 1726867233.88009: done getting the remaining hosts for this loop 13131 1726867233.88013: getting the next task for host managed_node1 13131 1726867233.88027: done getting next task for host managed_node1 13131 1726867233.88031: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13131 1726867233.88035: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13131 1726867233.88066: getting variables 13131 1726867233.88068: in VariableManager get_vars() 13131 1726867233.88360: Calling all_inventory to load vars for managed_node1 13131 1726867233.88364: Calling groups_inventory to load vars for managed_node1 13131 1726867233.88367: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867233.88381: Calling all_plugins_play to load vars for managed_node1 13131 1726867233.88385: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867233.88389: Calling groups_plugins_play to load vars for managed_node1 13131 1726867233.90240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867233.93127: done with get_vars() 13131 1726867233.93269: done getting variables 13131 1726867233.93333: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:20:33 -0400 (0:00:00.147) 0:00:49.044 ****** 13131 1726867233.93485: entering _queue_task() for managed_node1/service 13131 1726867233.94202: worker is 1 (out of 1 available) 13131 1726867233.94215: exiting _queue_task() for managed_node1/service 13131 1726867233.94227: done queuing things up, now waiting for results queue to drain 13131 1726867233.94228: waiting for pending results... 
13131 1726867233.94894: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 13131 1726867233.95202: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000017a 13131 1726867233.95208: variable 'ansible_search_path' from source: unknown 13131 1726867233.95211: variable 'ansible_search_path' from source: unknown 13131 1726867233.95213: calling self._execute() 13131 1726867233.95385: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867233.95558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867233.95561: variable 'omit' from source: magic vars 13131 1726867233.96642: variable 'ansible_distribution_major_version' from source: facts 13131 1726867233.96646: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867233.96900: variable 'network_provider' from source: set_fact 13131 1726867233.96915: Evaluated conditional (network_provider == "initscripts"): False 13131 1726867233.96924: when evaluation is False, skipping this task 13131 1726867233.96931: _execute() done 13131 1726867233.96938: dumping result to json 13131 1726867233.96945: done dumping result, returning 13131 1726867233.96956: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-5f24-9b7a-00000000017a] 13131 1726867233.96993: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000017a 13131 1726867233.97351: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000017a 13131 1726867233.97356: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13131 1726867233.97421: no more pending results, returning what we have 13131 1726867233.97424: results queue empty 13131 1726867233.97425: checking for any_errors_fatal 13131 1726867233.97432: done checking for 
any_errors_fatal 13131 1726867233.97433: checking for max_fail_percentage 13131 1726867233.97435: done checking for max_fail_percentage 13131 1726867233.97435: checking to see if all hosts have failed and the running result is not ok 13131 1726867233.97436: done checking to see if all hosts have failed 13131 1726867233.97437: getting the remaining hosts for this loop 13131 1726867233.97438: done getting the remaining hosts for this loop 13131 1726867233.97441: getting the next task for host managed_node1 13131 1726867233.97449: done getting next task for host managed_node1 13131 1726867233.97452: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13131 1726867233.97457: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867233.97482: getting variables 13131 1726867233.97484: in VariableManager get_vars() 13131 1726867233.97538: Calling all_inventory to load vars for managed_node1 13131 1726867233.97541: Calling groups_inventory to load vars for managed_node1 13131 1726867233.97543: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867233.97782: Calling all_plugins_play to load vars for managed_node1 13131 1726867233.97793: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867233.97798: Calling groups_plugins_play to load vars for managed_node1 13131 1726867234.01894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867234.05573: done with get_vars() 13131 1726867234.05665: done getting variables 13131 1726867234.05835: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:20:34 -0400 (0:00:00.125) 0:00:49.169 ****** 13131 1726867234.05873: entering _queue_task() for managed_node1/copy 13131 1726867234.07064: worker is 1 (out of 1 available) 13131 1726867234.07076: exiting _queue_task() for managed_node1/copy 13131 1726867234.07254: done queuing things up, now waiting for results queue to drain 13131 1726867234.07255: waiting for pending results... 
13131 1726867234.08095: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13131 1726867234.08100: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000017b 13131 1726867234.08106: variable 'ansible_search_path' from source: unknown 13131 1726867234.08484: variable 'ansible_search_path' from source: unknown 13131 1726867234.08488: calling self._execute() 13131 1726867234.08631: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867234.09283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867234.09286: variable 'omit' from source: magic vars 13131 1726867234.09660: variable 'ansible_distribution_major_version' from source: facts 13131 1726867234.09686: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867234.09983: variable 'network_provider' from source: set_fact 13131 1726867234.09987: Evaluated conditional (network_provider == "initscripts"): False 13131 1726867234.09990: when evaluation is False, skipping this task 13131 1726867234.09992: _execute() done 13131 1726867234.09995: dumping result to json 13131 1726867234.09998: done dumping result, returning 13131 1726867234.10005: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-5f24-9b7a-00000000017b] 13131 1726867234.10008: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000017b 13131 1726867234.10083: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000017b 13131 1726867234.10087: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13131 1726867234.10144: no more pending results, returning what we have 13131 1726867234.10148: results queue empty 13131 1726867234.10149: checking for 
any_errors_fatal 13131 1726867234.10155: done checking for any_errors_fatal 13131 1726867234.10156: checking for max_fail_percentage 13131 1726867234.10158: done checking for max_fail_percentage 13131 1726867234.10158: checking to see if all hosts have failed and the running result is not ok 13131 1726867234.10159: done checking to see if all hosts have failed 13131 1726867234.10160: getting the remaining hosts for this loop 13131 1726867234.10161: done getting the remaining hosts for this loop 13131 1726867234.10165: getting the next task for host managed_node1 13131 1726867234.10172: done getting next task for host managed_node1 13131 1726867234.10179: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13131 1726867234.10183: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867234.10206: getting variables 13131 1726867234.10208: in VariableManager get_vars() 13131 1726867234.10468: Calling all_inventory to load vars for managed_node1 13131 1726867234.10471: Calling groups_inventory to load vars for managed_node1 13131 1726867234.10473: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867234.10555: Calling all_plugins_play to load vars for managed_node1 13131 1726867234.10559: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867234.10563: Calling groups_plugins_play to load vars for managed_node1 13131 1726867234.13448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867234.16889: done with get_vars() 13131 1726867234.16910: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:20:34 -0400 (0:00:00.111) 0:00:49.280 ****** 13131 1726867234.17014: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 13131 1726867234.17425: worker is 1 (out of 1 available) 13131 1726867234.17438: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 13131 1726867234.17449: done queuing things up, now waiting for results queue to drain 13131 1726867234.17450: waiting for pending results... 
13131 1726867234.17772: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13131 1726867234.18283: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000017c 13131 1726867234.18287: variable 'ansible_search_path' from source: unknown 13131 1726867234.18290: variable 'ansible_search_path' from source: unknown 13131 1726867234.18292: calling self._execute() 13131 1726867234.18375: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867234.18458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867234.18469: variable 'omit' from source: magic vars 13131 1726867234.19157: variable 'ansible_distribution_major_version' from source: facts 13131 1726867234.19169: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867234.19176: variable 'omit' from source: magic vars 13131 1726867234.19354: variable 'omit' from source: magic vars 13131 1726867234.19733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13131 1726867234.21911: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13131 1726867234.21982: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13131 1726867234.22018: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13131 1726867234.22064: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13131 1726867234.22092: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13131 1726867234.22181: variable 'network_provider' from source: set_fact 13131 1726867234.22319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13131 1726867234.22360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13131 1726867234.22396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13131 1726867234.22435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13131 1726867234.22484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13131 1726867234.22702: variable 'omit' from source: magic vars 13131 1726867234.22710: variable 'omit' from source: magic vars 13131 1726867234.22759: variable 'network_connections' from source: task vars 13131 1726867234.22769: variable 'controller_profile' from source: play vars 13131 1726867234.22971: variable 'controller_profile' from source: play vars 13131 1726867234.23225: variable 'omit' from source: magic vars 13131 1726867234.23245: variable '__lsr_ansible_managed' from source: task vars 13131 1726867234.23403: variable '__lsr_ansible_managed' from source: task vars 13131 1726867234.23696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13131 1726867234.24286: Loaded config def from plugin (lookup/template) 13131 1726867234.24289: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13131 1726867234.24291: File lookup term: get_ansible_managed.j2 13131 1726867234.24293: 
variable 'ansible_search_path' from source: unknown 13131 1726867234.24295: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13131 1726867234.24299: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13131 1726867234.24391: variable 'ansible_search_path' from source: unknown 13131 1726867234.39838: variable 'ansible_managed' from source: unknown 13131 1726867234.40228: variable 'omit' from source: magic vars 13131 1726867234.40256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867234.40459: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867234.40462: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867234.40465: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867234.40468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867234.40470: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867234.40472: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867234.40474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867234.40667: Set connection var ansible_connection to ssh 13131 1726867234.40675: Set connection var ansible_timeout to 10 13131 1726867234.40680: Set connection var ansible_shell_type to sh 13131 1726867234.40715: Set connection var ansible_shell_executable to /bin/sh 13131 1726867234.40720: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867234.40784: Set connection var ansible_pipelining to False 13131 1726867234.40801: variable 'ansible_shell_executable' from source: unknown 13131 1726867234.40982: variable 'ansible_connection' from source: unknown 13131 1726867234.40985: variable 'ansible_module_compression' from source: unknown 13131 1726867234.40988: variable 'ansible_shell_type' from source: unknown 13131 1726867234.40990: variable 'ansible_shell_executable' from source: unknown 13131 1726867234.40993: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867234.40997: variable 'ansible_pipelining' from source: unknown 13131 1726867234.40999: variable 'ansible_timeout' from source: unknown 13131 1726867234.41001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867234.41187: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867234.41199: variable 'omit' from source: magic vars 13131 1726867234.41201: starting attempt loop 13131 1726867234.41207: running the handler 13131 1726867234.41219: _low_level_execute_command(): starting 13131 1726867234.41227: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867234.42970: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13131 1726867234.42975: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867234.43682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867234.43686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867234.43719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867234.43881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867234.45545: stdout chunk (state=3): >>>/root <<< 13131 1726867234.45691: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867234.45696: stdout chunk (state=3): >>><<< 13131 1726867234.45717: stderr chunk (state=3): >>><<< 13131 1726867234.45737: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867234.45751: _low_level_execute_command(): starting 13131 1726867234.45758: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867234.4573815-15469-136760803978788 `" && echo ansible-tmp-1726867234.4573815-15469-136760803978788="` echo /root/.ansible/tmp/ansible-tmp-1726867234.4573815-15469-136760803978788 `" ) && sleep 0' 13131 1726867234.47088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 13131 1726867234.47102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867234.47109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867234.47214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867234.47275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867234.47354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867234.49526: stdout chunk (state=3): >>>ansible-tmp-1726867234.4573815-15469-136760803978788=/root/.ansible/tmp/ansible-tmp-1726867234.4573815-15469-136760803978788 <<< 13131 1726867234.49642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867234.49645: stdout chunk (state=3): >>><<< 13131 1726867234.49647: stderr chunk (state=3): >>><<< 13131 1726867234.49883: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867234.4573815-15469-136760803978788=/root/.ansible/tmp/ansible-tmp-1726867234.4573815-15469-136760803978788 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867234.49886: variable 'ansible_module_compression' from source: unknown 13131 1726867234.49889: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13131 1726867234.49891: variable 'ansible_facts' from source: unknown 13131 1726867234.50484: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867234.4573815-15469-136760803978788/AnsiballZ_network_connections.py 13131 1726867234.50587: Sending initial data 13131 1726867234.50590: Sent initial data (168 bytes) 13131 1726867234.51857: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867234.51869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867234.51888: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867234.52194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867234.52209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867234.53788: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867234.53830: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 13131 1726867234.53876: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpc1_p8xgi /root/.ansible/tmp/ansible-tmp-1726867234.4573815-15469-136760803978788/AnsiballZ_network_connections.py <<< 13131 1726867234.53891: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867234.4573815-15469-136760803978788/AnsiballZ_network_connections.py" <<< 13131 1726867234.53983: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpc1_p8xgi" to remote "/root/.ansible/tmp/ansible-tmp-1726867234.4573815-15469-136760803978788/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867234.4573815-15469-136760803978788/AnsiballZ_network_connections.py" <<< 13131 1726867234.55956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867234.55992: stderr chunk (state=3): >>><<< 13131 1726867234.55995: stdout chunk (state=3): >>><<< 13131 1726867234.56068: done transferring module to remote 13131 1726867234.56168: _low_level_execute_command(): starting 13131 1726867234.56174: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867234.4573815-15469-136760803978788/ /root/.ansible/tmp/ansible-tmp-1726867234.4573815-15469-136760803978788/AnsiballZ_network_connections.py && sleep 0' 13131 1726867234.57059: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867234.57090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867234.57171: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867234.57234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867234.59053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867234.59202: stderr chunk (state=3): >>><<< 13131 1726867234.59209: stdout chunk (state=3): >>><<< 13131 1726867234.59226: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867234.59235: _low_level_execute_command(): starting 13131 1726867234.59316: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867234.4573815-15469-136760803978788/AnsiballZ_network_connections.py && sleep 0' 13131 1726867234.60279: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867234.60282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867234.60285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867234.60287: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867234.60289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867234.60350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867234.60354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 13131 1726867234.60424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867234.98681: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_wk4y5xhn/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_wk4y5xhn/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/69e7ee46-007a-470e-9bdc-4928b4af57bb: error=unknown <<< 13131 1726867234.98854: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13131 1726867235.00718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867235.00739: stderr chunk (state=3): >>><<< 13131 1726867235.00753: stdout chunk (state=3): >>><<< 13131 1726867235.00815: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_wk4y5xhn/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_wk4y5xhn/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/69e7ee46-007a-470e-9bdc-4928b4af57bb: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867235.00830: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'down', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867234.4573815-15469-136760803978788/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867235.00845: _low_level_execute_command(): starting 13131 1726867235.00857: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867234.4573815-15469-136760803978788/ > 
/dev/null 2>&1 && sleep 0' 13131 1726867235.01507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867235.01523: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867235.01537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867235.01588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867235.01656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867235.01680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867235.01713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867235.01781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867235.03782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867235.03786: stderr chunk (state=3): >>><<< 13131 1726867235.03789: stdout chunk (state=3): >>><<< 13131 1726867235.03795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867235.03797: handler run complete 13131 1726867235.03799: attempt loop complete, returning result 13131 1726867235.03800: _execute() done 13131 1726867235.03802: dumping result to json 13131 1726867235.03803: done dumping result, returning 13131 1726867235.03805: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-5f24-9b7a-00000000017c] 13131 1726867235.03810: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000017c 13131 1726867235.03873: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000017c 13131 1726867235.03876: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, 
"ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 13131 1726867235.04099: no more pending results, returning what we have 13131 1726867235.04102: results queue empty 13131 1726867235.04105: checking for any_errors_fatal 13131 1726867235.04113: done checking for any_errors_fatal 13131 1726867235.04114: checking for max_fail_percentage 13131 1726867235.04115: done checking for max_fail_percentage 13131 1726867235.04116: checking to see if all hosts have failed and the running result is not ok 13131 1726867235.04117: done checking to see if all hosts have failed 13131 1726867235.04118: getting the remaining hosts for this loop 13131 1726867235.04119: done getting the remaining hosts for this loop 13131 1726867235.04122: getting the next task for host managed_node1 13131 1726867235.04129: done getting next task for host managed_node1 13131 1726867235.04133: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13131 1726867235.04136: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867235.04147: getting variables 13131 1726867235.04148: in VariableManager get_vars() 13131 1726867235.04317: Calling all_inventory to load vars for managed_node1 13131 1726867235.04320: Calling groups_inventory to load vars for managed_node1 13131 1726867235.04322: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867235.04332: Calling all_plugins_play to load vars for managed_node1 13131 1726867235.04335: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867235.04338: Calling groups_plugins_play to load vars for managed_node1 13131 1726867235.05955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867235.07512: done with get_vars() 13131 1726867235.07535: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:20:35 -0400 (0:00:00.906) 0:00:50.186 ****** 13131 1726867235.07627: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 13131 1726867235.07940: worker is 1 (out of 1 available) 13131 1726867235.07952: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 13131 1726867235.07963: done queuing things up, now waiting for results queue to drain 13131 1726867235.07964: waiting for pending results... 
13131 1726867235.08235: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 13131 1726867235.08386: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000017d 13131 1726867235.08411: variable 'ansible_search_path' from source: unknown 13131 1726867235.08419: variable 'ansible_search_path' from source: unknown 13131 1726867235.08458: calling self._execute() 13131 1726867235.08558: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867235.08570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867235.08585: variable 'omit' from source: magic vars 13131 1726867235.08952: variable 'ansible_distribution_major_version' from source: facts 13131 1726867235.08966: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867235.09082: variable 'network_state' from source: role '' defaults 13131 1726867235.09097: Evaluated conditional (network_state != {}): False 13131 1726867235.09105: when evaluation is False, skipping this task 13131 1726867235.09112: _execute() done 13131 1726867235.09118: dumping result to json 13131 1726867235.09125: done dumping result, returning 13131 1726867235.09136: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-5f24-9b7a-00000000017d] 13131 1726867235.09145: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000017d skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13131 1726867235.09435: no more pending results, returning what we have 13131 1726867235.09439: results queue empty 13131 1726867235.09441: checking for any_errors_fatal 13131 1726867235.09449: done checking for any_errors_fatal 13131 1726867235.09450: checking for max_fail_percentage 13131 1726867235.09452: done checking for max_fail_percentage 13131 1726867235.09453: 
checking to see if all hosts have failed and the running result is not ok 13131 1726867235.09454: done checking to see if all hosts have failed 13131 1726867235.09455: getting the remaining hosts for this loop 13131 1726867235.09456: done getting the remaining hosts for this loop 13131 1726867235.09460: getting the next task for host managed_node1 13131 1726867235.09467: done getting next task for host managed_node1 13131 1726867235.09471: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13131 1726867235.09476: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867235.09655: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000017d 13131 1726867235.09658: WORKER PROCESS EXITING 13131 1726867235.09673: getting variables 13131 1726867235.09675: in VariableManager get_vars() 13131 1726867235.09720: Calling all_inventory to load vars for managed_node1 13131 1726867235.09723: Calling groups_inventory to load vars for managed_node1 13131 1726867235.09726: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867235.09735: Calling all_plugins_play to load vars for managed_node1 13131 1726867235.09737: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867235.09740: Calling groups_plugins_play to load vars for managed_node1 13131 1726867235.11167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867235.12664: done with get_vars() 13131 1726867235.12687: done getting variables 13131 1726867235.12744: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:20:35 -0400 (0:00:00.051) 0:00:50.238 ****** 13131 1726867235.12780: entering _queue_task() for managed_node1/debug 13131 1726867235.13054: worker is 1 (out of 1 available) 13131 1726867235.13067: exiting _queue_task() for managed_node1/debug 13131 1726867235.13079: done queuing things up, now waiting for results queue to drain 13131 1726867235.13081: waiting for pending results... 
13131 1726867235.13341: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13131 1726867235.13503: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000017e 13131 1726867235.13524: variable 'ansible_search_path' from source: unknown 13131 1726867235.13532: variable 'ansible_search_path' from source: unknown 13131 1726867235.13572: calling self._execute() 13131 1726867235.13665: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867235.13679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867235.13694: variable 'omit' from source: magic vars 13131 1726867235.14057: variable 'ansible_distribution_major_version' from source: facts 13131 1726867235.14074: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867235.14088: variable 'omit' from source: magic vars 13131 1726867235.14173: variable 'omit' from source: magic vars 13131 1726867235.14201: variable 'omit' from source: magic vars 13131 1726867235.14243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867235.14482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867235.14485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867235.14488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867235.14490: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867235.14492: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867235.14494: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867235.14496: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 13131 1726867235.14498: Set connection var ansible_connection to ssh 13131 1726867235.14500: Set connection var ansible_timeout to 10 13131 1726867235.14502: Set connection var ansible_shell_type to sh 13131 1726867235.14504: Set connection var ansible_shell_executable to /bin/sh 13131 1726867235.14518: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867235.14528: Set connection var ansible_pipelining to False 13131 1726867235.14552: variable 'ansible_shell_executable' from source: unknown 13131 1726867235.14559: variable 'ansible_connection' from source: unknown 13131 1726867235.14567: variable 'ansible_module_compression' from source: unknown 13131 1726867235.14574: variable 'ansible_shell_type' from source: unknown 13131 1726867235.14583: variable 'ansible_shell_executable' from source: unknown 13131 1726867235.14589: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867235.14596: variable 'ansible_pipelining' from source: unknown 13131 1726867235.14602: variable 'ansible_timeout' from source: unknown 13131 1726867235.14610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867235.14755: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867235.14773: variable 'omit' from source: magic vars 13131 1726867235.14785: starting attempt loop 13131 1726867235.14792: running the handler 13131 1726867235.14922: variable '__network_connections_result' from source: set_fact 13131 1726867235.14980: handler run complete 13131 1726867235.15002: attempt loop complete, returning result 13131 1726867235.15009: _execute() done 13131 1726867235.15016: dumping result to json 13131 1726867235.15023: 
done dumping result, returning 13131 1726867235.15035: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-5f24-9b7a-00000000017e] 13131 1726867235.15043: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000017e ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 13131 1726867235.15226: no more pending results, returning what we have 13131 1726867235.15230: results queue empty 13131 1726867235.15231: checking for any_errors_fatal 13131 1726867235.15239: done checking for any_errors_fatal 13131 1726867235.15240: checking for max_fail_percentage 13131 1726867235.15242: done checking for max_fail_percentage 13131 1726867235.15243: checking to see if all hosts have failed and the running result is not ok 13131 1726867235.15243: done checking to see if all hosts have failed 13131 1726867235.15244: getting the remaining hosts for this loop 13131 1726867235.15246: done getting the remaining hosts for this loop 13131 1726867235.15249: getting the next task for host managed_node1 13131 1726867235.15255: done getting next task for host managed_node1 13131 1726867235.15259: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13131 1726867235.15263: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13131 1726867235.15276: getting variables 13131 1726867235.15279: in VariableManager get_vars() 13131 1726867235.15325: Calling all_inventory to load vars for managed_node1 13131 1726867235.15328: Calling groups_inventory to load vars for managed_node1 13131 1726867235.15331: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867235.15340: Calling all_plugins_play to load vars for managed_node1 13131 1726867235.15343: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867235.15346: Calling groups_plugins_play to load vars for managed_node1 13131 1726867235.15991: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000017e 13131 1726867235.15994: WORKER PROCESS EXITING 13131 1726867235.16873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867235.18369: done with get_vars() 13131 1726867235.18392: done getting variables 13131 1726867235.18440: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:20:35 -0400 (0:00:00.056) 0:00:50.295 ****** 13131 1726867235.18472: entering _queue_task() for managed_node1/debug 13131 1726867235.18753: worker is 1 (out of 1 available) 13131 1726867235.18764: exiting _queue_task() for managed_node1/debug 13131 
1726867235.18776: done queuing things up, now waiting for results queue to drain 13131 1726867235.18780: waiting for pending results... 13131 1726867235.19056: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13131 1726867235.19216: in run() - task 0affcac9-a3a5-5f24-9b7a-00000000017f 13131 1726867235.19237: variable 'ansible_search_path' from source: unknown 13131 1726867235.19246: variable 'ansible_search_path' from source: unknown 13131 1726867235.19287: calling self._execute() 13131 1726867235.19387: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867235.19399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867235.19415: variable 'omit' from source: magic vars 13131 1726867235.19796: variable 'ansible_distribution_major_version' from source: facts 13131 1726867235.19813: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867235.19826: variable 'omit' from source: magic vars 13131 1726867235.19900: variable 'omit' from source: magic vars 13131 1726867235.19937: variable 'omit' from source: magic vars 13131 1726867235.19980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867235.20017: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867235.20039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867235.20060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867235.20080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867235.20114: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 
1726867235.20181: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867235.20184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867235.20232: Set connection var ansible_connection to ssh 13131 1726867235.20243: Set connection var ansible_timeout to 10 13131 1726867235.20249: Set connection var ansible_shell_type to sh 13131 1726867235.20259: Set connection var ansible_shell_executable to /bin/sh 13131 1726867235.20270: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867235.20279: Set connection var ansible_pipelining to False 13131 1726867235.20310: variable 'ansible_shell_executable' from source: unknown 13131 1726867235.20319: variable 'ansible_connection' from source: unknown 13131 1726867235.20382: variable 'ansible_module_compression' from source: unknown 13131 1726867235.20385: variable 'ansible_shell_type' from source: unknown 13131 1726867235.20387: variable 'ansible_shell_executable' from source: unknown 13131 1726867235.20389: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867235.20391: variable 'ansible_pipelining' from source: unknown 13131 1726867235.20394: variable 'ansible_timeout' from source: unknown 13131 1726867235.20395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867235.20498: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867235.20518: variable 'omit' from source: magic vars 13131 1726867235.20529: starting attempt loop 13131 1726867235.20535: running the handler 13131 1726867235.20588: variable '__network_connections_result' from source: set_fact 13131 1726867235.20669: variable '__network_connections_result' from 
source: set_fact 13131 1726867235.20782: handler run complete 13131 1726867235.20810: attempt loop complete, returning result 13131 1726867235.20841: _execute() done 13131 1726867235.20844: dumping result to json 13131 1726867235.20846: done dumping result, returning 13131 1726867235.20848: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-5f24-9b7a-00000000017f] 13131 1726867235.20852: sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000017f ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 13131 1726867235.21176: no more pending results, returning what we have 13131 1726867235.21182: results queue empty 13131 1726867235.21183: checking for any_errors_fatal 13131 1726867235.21189: done checking for any_errors_fatal 13131 1726867235.21190: checking for max_fail_percentage 13131 1726867235.21193: done checking for max_fail_percentage 13131 1726867235.21194: checking to see if all hosts have failed and the running result is not ok 13131 1726867235.21195: done checking to see if all hosts have failed 13131 1726867235.21195: getting the remaining hosts for this loop 13131 1726867235.21197: done getting the remaining hosts for this loop 13131 1726867235.21200: getting the next task for host managed_node1 13131 1726867235.21208: done getting next task for host managed_node1 13131 1726867235.21212: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13131 1726867235.21217: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13131 1726867235.21232: getting variables 13131 1726867235.21234: in VariableManager get_vars() 13131 1726867235.21470: Calling all_inventory to load vars for managed_node1 13131 1726867235.21473: Calling groups_inventory to load vars for managed_node1 13131 1726867235.21475: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867235.21484: done sending task result for task 0affcac9-a3a5-5f24-9b7a-00000000017f 13131 1726867235.21486: WORKER PROCESS EXITING 13131 1726867235.21495: Calling all_plugins_play to load vars for managed_node1 13131 1726867235.21498: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867235.21501: Calling groups_plugins_play to load vars for managed_node1 13131 1726867235.22960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867235.24474: done with get_vars() 13131 1726867235.24511: done getting variables 13131 1726867235.24573: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:20:35 -0400 (0:00:00.061) 0:00:50.356 ****** 13131 1726867235.24610: entering _queue_task() for managed_node1/debug 13131 1726867235.24974: worker is 1 (out of 1 available) 13131 1726867235.24990: exiting _queue_task() for managed_node1/debug 13131 1726867235.25002: done queuing things up, now waiting for results queue to drain 13131 1726867235.25004: waiting for pending results... 13131 1726867235.25292: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13131 1726867235.25452: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000180 13131 1726867235.25471: variable 'ansible_search_path' from source: unknown 13131 1726867235.25479: variable 'ansible_search_path' from source: unknown 13131 1726867235.25520: calling self._execute() 13131 1726867235.25622: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867235.25638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867235.25655: variable 'omit' from source: magic vars 13131 1726867235.26026: variable 'ansible_distribution_major_version' from source: facts 13131 1726867235.26043: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867235.26175: variable 'network_state' from source: role '' defaults 13131 1726867235.26191: Evaluated conditional (network_state != {}): False 13131 1726867235.26199: when evaluation is False, skipping this task 13131 1726867235.26205: _execute() done 13131 1726867235.26213: dumping result to json 13131 1726867235.26283: done 
dumping result, returning 13131 1726867235.26287: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-5f24-9b7a-000000000180] 13131 1726867235.26290: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000180 13131 1726867235.26361: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000180 13131 1726867235.26365: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 13131 1726867235.26433: no more pending results, returning what we have 13131 1726867235.26437: results queue empty 13131 1726867235.26438: checking for any_errors_fatal 13131 1726867235.26452: done checking for any_errors_fatal 13131 1726867235.26453: checking for max_fail_percentage 13131 1726867235.26456: done checking for max_fail_percentage 13131 1726867235.26457: checking to see if all hosts have failed and the running result is not ok 13131 1726867235.26458: done checking to see if all hosts have failed 13131 1726867235.26458: getting the remaining hosts for this loop 13131 1726867235.26460: done getting the remaining hosts for this loop 13131 1726867235.26464: getting the next task for host managed_node1 13131 1726867235.26472: done getting next task for host managed_node1 13131 1726867235.26476: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13131 1726867235.26482: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13131 1726867235.26507: getting variables 13131 1726867235.26509: in VariableManager get_vars() 13131 1726867235.26567: Calling all_inventory to load vars for managed_node1 13131 1726867235.26570: Calling groups_inventory to load vars for managed_node1 13131 1726867235.26572: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867235.26864: Calling all_plugins_play to load vars for managed_node1 13131 1726867235.26868: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867235.26872: Calling groups_plugins_play to load vars for managed_node1 13131 1726867235.28190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867235.29754: done with get_vars() 13131 1726867235.29780: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:20:35 -0400 (0:00:00.052) 0:00:50.409 ****** 13131 1726867235.29872: entering _queue_task() for managed_node1/ping 13131 1726867235.30412: worker is 1 (out of 1 available) 13131 1726867235.30422: exiting _queue_task() for managed_node1/ping 13131 1726867235.30431: done queuing things up, now waiting for results queue to drain 13131 1726867235.30432: waiting for pending results... 
13131 1726867235.30661: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 13131 1726867235.30779: in run() - task 0affcac9-a3a5-5f24-9b7a-000000000181 13131 1726867235.30784: variable 'ansible_search_path' from source: unknown 13131 1726867235.30787: variable 'ansible_search_path' from source: unknown 13131 1726867235.30828: calling self._execute() 13131 1726867235.30993: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867235.30997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867235.31000: variable 'omit' from source: magic vars 13131 1726867235.31348: variable 'ansible_distribution_major_version' from source: facts 13131 1726867235.31365: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867235.31380: variable 'omit' from source: magic vars 13131 1726867235.31447: variable 'omit' from source: magic vars 13131 1726867235.31484: variable 'omit' from source: magic vars 13131 1726867235.31526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867235.31568: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867235.31594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867235.31645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867235.31648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867235.31671: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867235.31682: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867235.31690: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 13131 1726867235.31863: Set connection var ansible_connection to ssh 13131 1726867235.31866: Set connection var ansible_timeout to 10 13131 1726867235.31868: Set connection var ansible_shell_type to sh 13131 1726867235.31870: Set connection var ansible_shell_executable to /bin/sh 13131 1726867235.31872: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867235.31874: Set connection var ansible_pipelining to False 13131 1726867235.31876: variable 'ansible_shell_executable' from source: unknown 13131 1726867235.31880: variable 'ansible_connection' from source: unknown 13131 1726867235.31882: variable 'ansible_module_compression' from source: unknown 13131 1726867235.31884: variable 'ansible_shell_type' from source: unknown 13131 1726867235.31886: variable 'ansible_shell_executable' from source: unknown 13131 1726867235.31888: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867235.31896: variable 'ansible_pipelining' from source: unknown 13131 1726867235.31903: variable 'ansible_timeout' from source: unknown 13131 1726867235.31913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867235.32121: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13131 1726867235.32138: variable 'omit' from source: magic vars 13131 1726867235.32148: starting attempt loop 13131 1726867235.32155: running the handler 13131 1726867235.32175: _low_level_execute_command(): starting 13131 1726867235.32282: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867235.33016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867235.33035: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 
1726867235.33072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867235.33182: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867235.33199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867235.33296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867235.34949: stdout chunk (state=3): >>>/root <<< 13131 1726867235.35083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867235.35132: stderr chunk (state=3): >>><<< 13131 1726867235.35141: stdout chunk (state=3): >>><<< 13131 1726867235.35174: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867235.35199: _low_level_execute_command(): starting 13131 1726867235.35219: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867235.3518267-15527-211263280431247 `" && echo ansible-tmp-1726867235.3518267-15527-211263280431247="` echo /root/.ansible/tmp/ansible-tmp-1726867235.3518267-15527-211263280431247 `" ) && sleep 0' 13131 1726867235.35879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867235.35895: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867235.35912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867235.35962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867235.35986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867235.36144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867235.36198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867235.36354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867235.38254: stdout chunk (state=3): >>>ansible-tmp-1726867235.3518267-15527-211263280431247=/root/.ansible/tmp/ansible-tmp-1726867235.3518267-15527-211263280431247 <<< 13131 1726867235.38419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867235.38422: stdout chunk (state=3): >>><<< 13131 1726867235.38425: stderr chunk (state=3): >>><<< 13131 1726867235.38483: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867235.3518267-15527-211263280431247=/root/.ansible/tmp/ansible-tmp-1726867235.3518267-15527-211263280431247 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867235.38649: variable 'ansible_module_compression' from source: unknown 13131 1726867235.38654: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 13131 1726867235.38656: variable 'ansible_facts' from source: unknown 13131 1726867235.38736: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867235.3518267-15527-211263280431247/AnsiballZ_ping.py 13131 1726867235.39071: Sending initial data 13131 1726867235.39163: Sent initial data (153 bytes) 13131 1726867235.39783: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867235.39964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867235.40016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867235.41608: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867235.41657: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867235.41679: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp_i2i9ijm /root/.ansible/tmp/ansible-tmp-1726867235.3518267-15527-211263280431247/AnsiballZ_ping.py <<< 13131 1726867235.41684: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867235.3518267-15527-211263280431247/AnsiballZ_ping.py" <<< 13131 1726867235.41733: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp_i2i9ijm" to remote "/root/.ansible/tmp/ansible-tmp-1726867235.3518267-15527-211263280431247/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867235.3518267-15527-211263280431247/AnsiballZ_ping.py" <<< 13131 1726867235.43127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867235.43131: stdout chunk (state=3): >>><<< 13131 1726867235.43139: stderr chunk (state=3): >>><<< 13131 1726867235.43389: done transferring module to remote 13131 1726867235.43399: _low_level_execute_command(): starting 13131 1726867235.43405: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867235.3518267-15527-211263280431247/ /root/.ansible/tmp/ansible-tmp-1726867235.3518267-15527-211263280431247/AnsiballZ_ping.py && sleep 0' 13131 1726867235.44473: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867235.44675: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867235.44693: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867235.44704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867235.44780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867235.46652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867235.46655: stdout chunk (state=3): >>><<< 13131 1726867235.46662: stderr chunk (state=3): >>><<< 13131 1726867235.46681: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867235.46889: _low_level_execute_command(): starting 13131 1726867235.46894: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867235.3518267-15527-211263280431247/AnsiballZ_ping.py && sleep 0' 13131 1726867235.48054: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867235.48391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867235.48414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867235.48483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867235.63198: stdout chunk (state=3): >>> {"ping": "pong", "invocation": 
{"module_args": {"data": "pong"}}} <<< 13131 1726867235.64471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867235.64475: stderr chunk (state=3): >>><<< 13131 1726867235.64480: stdout chunk (state=3): >>><<< 13131 1726867235.64512: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867235.64521: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867235.3518267-15527-211263280431247/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867235.64531: _low_level_execute_command(): starting 13131 1726867235.64536: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867235.3518267-15527-211263280431247/ > /dev/null 2>&1 && sleep 0' 13131 1726867235.65667: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867235.65674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867235.65697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867235.65713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867235.65875: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867235.65885: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867235.65894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867235.65928: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867235.65932: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 
1726867235.65935: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867235.65937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867235.65940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867235.66021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867235.66042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867235.66108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867235.67941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867235.67951: stdout chunk (state=3): >>><<< 13131 1726867235.67963: stderr chunk (state=3): >>><<< 13131 1726867235.67986: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867235.67998: handler run complete 13131 1726867235.68017: attempt loop complete, returning result 13131 1726867235.68025: _execute() done 13131 1726867235.68032: dumping result to json 13131 1726867235.68040: done dumping result, returning 13131 1726867235.68054: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-5f24-9b7a-000000000181] 13131 1726867235.68067: sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000181 ok: [managed_node1] => { "changed": false, "ping": "pong" } 13131 1726867235.68243: no more pending results, returning what we have 13131 1726867235.68246: results queue empty 13131 1726867235.68384: checking for any_errors_fatal 13131 1726867235.68390: done checking for any_errors_fatal 13131 1726867235.68391: checking for max_fail_percentage 13131 1726867235.68393: done checking for max_fail_percentage 13131 1726867235.68394: checking to see if all hosts have failed and the running result is not ok 13131 1726867235.68395: done checking to see if all hosts have failed 13131 1726867235.68395: getting the remaining hosts for this loop 13131 1726867235.68397: done getting the remaining hosts for this loop 13131 1726867235.68400: getting the next task for host managed_node1 13131 1726867235.68411: done getting next task for host managed_node1 13131 1726867235.68413: ^ task is: TASK: meta (role_complete) 13131 1726867235.68417: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13131 1726867235.68434: getting variables 13131 1726867235.68435: in VariableManager get_vars() 13131 1726867235.68539: Calling all_inventory to load vars for managed_node1 13131 1726867235.68542: Calling groups_inventory to load vars for managed_node1 13131 1726867235.68545: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867235.68551: done sending task result for task 0affcac9-a3a5-5f24-9b7a-000000000181 13131 1726867235.68554: WORKER PROCESS EXITING 13131 1726867235.68564: Calling all_plugins_play to load vars for managed_node1 13131 1726867235.68567: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867235.68570: Calling groups_plugins_play to load vars for managed_node1 13131 1726867235.71614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867235.75658: done with get_vars() 13131 1726867235.75894: done getting variables 13131 1726867235.75986: done queuing things up, now waiting for results queue to drain 13131 1726867235.75988: results queue empty 13131 1726867235.75989: checking for any_errors_fatal 13131 1726867235.75993: done checking for any_errors_fatal 13131 1726867235.75993: checking for max_fail_percentage 13131 1726867235.75994: done 
checking for max_fail_percentage 13131 1726867235.75995: checking to see if all hosts have failed and the running result is not ok 13131 1726867235.75996: done checking to see if all hosts have failed 13131 1726867235.75996: getting the remaining hosts for this loop 13131 1726867235.75997: done getting the remaining hosts for this loop 13131 1726867235.76000: getting the next task for host managed_node1 13131 1726867235.76004: done getting next task for host managed_node1 13131 1726867235.76006: ^ task is: TASK: Delete the device '{{ controller_device }}' 13131 1726867235.76008: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867235.76011: getting variables 13131 1726867235.76012: in VariableManager get_vars() 13131 1726867235.76032: Calling all_inventory to load vars for managed_node1 13131 1726867235.76034: Calling groups_inventory to load vars for managed_node1 13131 1726867235.76036: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867235.76040: Calling all_plugins_play to load vars for managed_node1 13131 1726867235.76042: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867235.76045: Calling groups_plugins_play to load vars for managed_node1 13131 1726867235.77372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867235.90110: done with get_vars() 13131 1726867235.90139: done getting variables 13131 1726867235.90322: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13131 1726867235.90526: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:242 Friday 20 September 2024 17:20:35 -0400 (0:00:00.606) 0:00:51.016 ****** 13131 1726867235.90553: entering _queue_task() for managed_node1/command 13131 1726867235.91462: worker is 1 (out of 1 available) 13131 1726867235.91475: exiting _queue_task() for managed_node1/command 13131 1726867235.91489: done queuing things up, now waiting for results queue to drain 13131 1726867235.91490: waiting for pending results... 
13131 1726867235.91817: running TaskExecutor() for managed_node1/TASK: Delete the device 'nm-bond' 13131 1726867235.91951: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000001b1 13131 1726867235.92185: variable 'ansible_search_path' from source: unknown 13131 1726867235.92190: calling self._execute() 13131 1726867235.92193: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867235.92195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867235.92198: variable 'omit' from source: magic vars 13131 1726867235.92549: variable 'ansible_distribution_major_version' from source: facts 13131 1726867235.92858: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867235.92863: variable 'omit' from source: magic vars 13131 1726867235.92866: variable 'omit' from source: magic vars 13131 1726867235.92995: variable 'controller_device' from source: play vars 13131 1726867235.93020: variable 'omit' from source: magic vars 13131 1726867235.93066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867235.93221: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867235.93250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867235.93273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867235.93306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867235.93420: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867235.93430: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867235.93438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 
1726867235.93884: Set connection var ansible_connection to ssh 13131 1726867235.93887: Set connection var ansible_timeout to 10 13131 1726867235.93889: Set connection var ansible_shell_type to sh 13131 1726867235.93891: Set connection var ansible_shell_executable to /bin/sh 13131 1726867235.93893: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867235.93895: Set connection var ansible_pipelining to False 13131 1726867235.93897: variable 'ansible_shell_executable' from source: unknown 13131 1726867235.93899: variable 'ansible_connection' from source: unknown 13131 1726867235.93901: variable 'ansible_module_compression' from source: unknown 13131 1726867235.93903: variable 'ansible_shell_type' from source: unknown 13131 1726867235.93904: variable 'ansible_shell_executable' from source: unknown 13131 1726867235.93906: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867235.93908: variable 'ansible_pipelining' from source: unknown 13131 1726867235.93910: variable 'ansible_timeout' from source: unknown 13131 1726867235.93911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867235.94054: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867235.94222: variable 'omit' from source: magic vars 13131 1726867235.94233: starting attempt loop 13131 1726867235.94239: running the handler 13131 1726867235.94296: _low_level_execute_command(): starting 13131 1726867235.94299: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867235.96283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867235.96386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867235.96399: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867235.96739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867235.96792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867235.96822: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867235.96968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867235.98596: stdout chunk (state=3): >>>/root <<< 13131 1726867235.98701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867235.98735: stderr chunk (state=3): >>><<< 13131 1726867235.98739: stdout chunk (state=3): >>><<< 13131 1726867235.98784: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867235.98787: _low_level_execute_command(): starting 13131 1726867235.98790: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867235.9876587-15577-208399514663804 `" && echo ansible-tmp-1726867235.9876587-15577-208399514663804="` echo /root/.ansible/tmp/ansible-tmp-1726867235.9876587-15577-208399514663804 `" ) && sleep 0' 13131 1726867235.99392: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867235.99397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867235.99400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867235.99403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867235.99405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867235.99416: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867235.99446: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867235.99449: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867235.99982: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867235.99985: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867235.99988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867235.99990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867235.99992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867235.99995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867236.00317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867236.00682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867236.02264: stdout chunk (state=3): >>>ansible-tmp-1726867235.9876587-15577-208399514663804=/root/.ansible/tmp/ansible-tmp-1726867235.9876587-15577-208399514663804 <<< 13131 1726867236.02399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867236.02410: stderr chunk (state=3): >>><<< 13131 1726867236.02413: stdout chunk (state=3): >>><<< 13131 1726867236.02434: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867235.9876587-15577-208399514663804=/root/.ansible/tmp/ansible-tmp-1726867235.9876587-15577-208399514663804 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867236.02473: variable 'ansible_module_compression' from source: unknown 13131 1726867236.02523: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13131 1726867236.02559: variable 'ansible_facts' from source: unknown 13131 1726867236.02784: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867235.9876587-15577-208399514663804/AnsiballZ_command.py 13131 1726867236.03001: Sending initial data 13131 1726867236.03093: Sent initial data (156 bytes) 13131 1726867236.04333: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867236.04338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867236.04391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867236.04552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867236.04572: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867236.04647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867236.06257: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867236.06313: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp8eb_3mvl /root/.ansible/tmp/ansible-tmp-1726867235.9876587-15577-208399514663804/AnsiballZ_command.py <<< 13131 1726867236.06317: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867235.9876587-15577-208399514663804/AnsiballZ_command.py" <<< 13131 1726867236.06424: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp8eb_3mvl" to remote "/root/.ansible/tmp/ansible-tmp-1726867235.9876587-15577-208399514663804/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867235.9876587-15577-208399514663804/AnsiballZ_command.py" <<< 13131 1726867236.07975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867236.07980: stderr chunk (state=3): >>><<< 13131 1726867236.07982: stdout chunk (state=3): >>><<< 13131 1726867236.07991: done transferring module to remote 13131 1726867236.08005: _low_level_execute_command(): starting 13131 1726867236.08014: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867235.9876587-15577-208399514663804/ /root/.ansible/tmp/ansible-tmp-1726867235.9876587-15577-208399514663804/AnsiballZ_command.py && sleep 0' 13131 1726867236.09328: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867236.09395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867236.09568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867236.09599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867236.09673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867236.11588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867236.11601: stdout chunk (state=3): >>><<< 13131 1726867236.11784: stderr chunk (state=3): >>><<< 13131 1726867236.11788: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867236.11790: _low_level_execute_command(): starting 13131 1726867236.11792: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867235.9876587-15577-208399514663804/AnsiballZ_command.py && sleep 0' 13131 1726867236.12808: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867236.12939: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867236.12999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867236.13017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867236.13048: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867236.13419: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 13131 1726867236.29136: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 17:20:36.281416", "end": "2024-09-20 17:20:36.288629", "delta": "0:00:00.007213", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13131 1726867236.30495: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.12.57 closed. <<< 13131 1726867236.30523: stdout chunk (state=3): >>><<< 13131 1726867236.30528: stderr chunk (state=3): >>><<< 13131 1726867236.30583: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 17:20:36.281416", "end": "2024-09-20 17:20:36.288629", "delta": "0:00:00.007213", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.12.57 closed. 13131 1726867236.30604: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867235.9876587-15577-208399514663804/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867236.30618: _low_level_execute_command(): starting 13131 1726867236.30634: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867235.9876587-15577-208399514663804/ > /dev/null 2>&1 && sleep 0' 13131 1726867236.31429: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867236.31509: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867236.31514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867236.31516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13131 1726867236.31518: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867236.31523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867236.31544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867236.31557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867236.31625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867236.33465: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867236.33468: stdout chunk (state=3): >>><<< 13131 1726867236.33682: stderr chunk (state=3): >>><<< 13131 1726867236.33686: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867236.33688: handler run complete 13131 1726867236.33689: Evaluated conditional (False): False 13131 1726867236.33691: Evaluated conditional (False): False 13131 1726867236.33693: attempt loop complete, returning result 13131 1726867236.33694: _execute() done 13131 1726867236.33696: dumping result to json 13131 1726867236.33698: done dumping result, returning 13131 1726867236.33700: done running TaskExecutor() for managed_node1/TASK: Delete the device 'nm-bond' [0affcac9-a3a5-5f24-9b7a-0000000001b1] 13131 1726867236.33703: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001b1 13131 1726867236.33769: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001b1 13131 1726867236.33772: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "del",
        "nm-bond"
    ],
    "delta": "0:00:00.007213",
    "end": "2024-09-20 17:20:36.288629",
    "failed_when_result": false,
    "rc": 1,
    "start": "2024-09-20 17:20:36.281416"
}

STDERR:

Cannot find device "nm-bond"

MSG:

non-zero return code
13131 1726867236.33858: no more pending results, returning what we have 13131 1726867236.33862:
results queue empty 13131 1726867236.33863: checking for any_errors_fatal 13131 1726867236.33865: done checking for any_errors_fatal 13131 1726867236.33866: checking for max_fail_percentage 13131 1726867236.33868: done checking for max_fail_percentage 13131 1726867236.33869: checking to see if all hosts have failed and the running result is not ok 13131 1726867236.33869: done checking to see if all hosts have failed 13131 1726867236.33870: getting the remaining hosts for this loop 13131 1726867236.33872: done getting the remaining hosts for this loop 13131 1726867236.33875: getting the next task for host managed_node1 13131 1726867236.33886: done getting next task for host managed_node1 13131 1726867236.33889: ^ task is: TASK: Remove test interfaces 13131 1726867236.33893: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867236.33898: getting variables 13131 1726867236.33899: in VariableManager get_vars() 13131 1726867236.33958: Calling all_inventory to load vars for managed_node1 13131 1726867236.33961: Calling groups_inventory to load vars for managed_node1 13131 1726867236.33964: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867236.33975: Calling all_plugins_play to load vars for managed_node1 13131 1726867236.34199: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867236.34206: Calling groups_plugins_play to load vars for managed_node1 13131 1726867236.35874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867236.37693: done with get_vars() 13131 1726867236.37717: done getting variables 13131 1726867236.37775: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Remove test interfaces] **************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3
Friday 20 September 2024  17:20:36 -0400 (0:00:00.472)       0:00:51.488 ******
13131 1726867236.37815: entering _queue_task() for managed_node1/shell 13131 1726867236.38144: worker is 1 (out of 1 available) 13131 1726867236.38155: exiting _queue_task() for managed_node1/shell 13131 1726867236.38166: done queuing things up, now waiting for results queue to drain 13131 1726867236.38167: waiting for pending results... 
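[editor's note] The "Delete the device 'nm-bond'" trace above shows the command returning rc=1 ("Cannot find device") yet the task reporting `ok` with `failed_when_result: false`, i.e. the playbook deliberately tolerates the device already being absent. The task file itself is not part of this log; a minimal hypothetical sketch of such a cleanup task, consistent with the recorded result, might look like:

```yaml
# Hypothetical reconstruction -- the actual task source is not shown in this log.
# Removes the bond device if it exists; a missing device makes `ip link del`
# exit 1, which is treated as success via the failed_when override.
- name: Delete the device 'nm-bond'
  command: ip link del nm-bond
  register: link_del
  failed_when: link_del.rc != 0 and 'Cannot find device' not in link_del.stderr
  changed_when: link_del.rc == 0
```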
13131 1726867236.38474: running TaskExecutor() for managed_node1/TASK: Remove test interfaces 13131 1726867236.38628: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000001b5 13131 1726867236.38644: variable 'ansible_search_path' from source: unknown 13131 1726867236.38647: variable 'ansible_search_path' from source: unknown 13131 1726867236.38688: calling self._execute() 13131 1726867236.38804: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867236.38814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867236.38826: variable 'omit' from source: magic vars 13131 1726867236.39245: variable 'ansible_distribution_major_version' from source: facts 13131 1726867236.39252: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867236.39259: variable 'omit' from source: magic vars 13131 1726867236.39331: variable 'omit' from source: magic vars 13131 1726867236.39683: variable 'dhcp_interface1' from source: play vars 13131 1726867236.39687: variable 'dhcp_interface2' from source: play vars 13131 1726867236.39689: variable 'omit' from source: magic vars 13131 1726867236.39691: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867236.39694: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867236.39697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867236.39699: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867236.39702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867236.39704: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867236.39706: variable 'ansible_host' from source: host 
vars for 'managed_node1' 13131 1726867236.39708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867236.39800: Set connection var ansible_connection to ssh 13131 1726867236.39804: Set connection var ansible_timeout to 10 13131 1726867236.39810: Set connection var ansible_shell_type to sh 13131 1726867236.39819: Set connection var ansible_shell_executable to /bin/sh 13131 1726867236.39831: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867236.39834: Set connection var ansible_pipelining to False 13131 1726867236.39864: variable 'ansible_shell_executable' from source: unknown 13131 1726867236.39867: variable 'ansible_connection' from source: unknown 13131 1726867236.39870: variable 'ansible_module_compression' from source: unknown 13131 1726867236.39872: variable 'ansible_shell_type' from source: unknown 13131 1726867236.39875: variable 'ansible_shell_executable' from source: unknown 13131 1726867236.39879: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867236.39882: variable 'ansible_pipelining' from source: unknown 13131 1726867236.39884: variable 'ansible_timeout' from source: unknown 13131 1726867236.39887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867236.40187: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867236.40191: variable 'omit' from source: magic vars 13131 1726867236.40193: starting attempt loop 13131 1726867236.40195: running the handler 13131 1726867236.40198: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867236.40201: _low_level_execute_command(): starting 13131 1726867236.40203: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867236.40928: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867236.40958: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867236.41034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867236.42746: stdout chunk (state=3): >>>/root <<< 13131 1726867236.43099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867236.43105: stdout chunk (state=3): >>><<< 13131 1726867236.43118: stderr chunk (state=3): >>><<< 13131 1726867236.43139: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867236.43154: _low_level_execute_command(): starting 13131 1726867236.43160: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867236.431397-15604-88758091343333 `" && echo ansible-tmp-1726867236.431397-15604-88758091343333="` echo /root/.ansible/tmp/ansible-tmp-1726867236.431397-15604-88758091343333 `" ) && sleep 0' 13131 1726867236.43928: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867236.43944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867236.43958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867236.43992: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867236.44011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867236.44024: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867236.44039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867236.44093: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867236.44142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867236.44165: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867236.44252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867236.44268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867236.46125: stdout chunk (state=3): >>>ansible-tmp-1726867236.431397-15604-88758091343333=/root/.ansible/tmp/ansible-tmp-1726867236.431397-15604-88758091343333 <<< 13131 1726867236.46248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867236.46272: stderr chunk (state=3): >>><<< 13131 1726867236.46275: stdout chunk (state=3): >>><<< 13131 1726867236.46298: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867236.431397-15604-88758091343333=/root/.ansible/tmp/ansible-tmp-1726867236.431397-15604-88758091343333 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867236.46341: variable 'ansible_module_compression' from source: unknown 13131 1726867236.46470: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13131 1726867236.46473: variable 'ansible_facts' from source: unknown 13131 1726867236.46528: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867236.431397-15604-88758091343333/AnsiballZ_command.py 13131 1726867236.46733: Sending initial data 13131 1726867236.46744: Sent initial data (154 bytes) 13131 1726867236.47334: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867236.47599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867236.47616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867236.47640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867236.47818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867236.49250: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13131 1726867236.49272: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867236.49340: stderr chunk 
(state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13131 1726867236.49413: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp3m36a5ck /root/.ansible/tmp/ansible-tmp-1726867236.431397-15604-88758091343333/AnsiballZ_command.py <<< 13131 1726867236.49430: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867236.431397-15604-88758091343333/AnsiballZ_command.py" <<< 13131 1726867236.49467: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmp3m36a5ck" to remote "/root/.ansible/tmp/ansible-tmp-1726867236.431397-15604-88758091343333/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867236.431397-15604-88758091343333/AnsiballZ_command.py" <<< 13131 1726867236.50519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867236.50522: stdout chunk (state=3): >>><<< 13131 1726867236.50525: stderr chunk (state=3): >>><<< 13131 1726867236.50527: done transferring module to remote 13131 1726867236.50529: _low_level_execute_command(): starting 13131 1726867236.50531: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867236.431397-15604-88758091343333/ /root/.ansible/tmp/ansible-tmp-1726867236.431397-15604-88758091343333/AnsiballZ_command.py && sleep 0' 13131 1726867236.51880: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867236.51912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867236.51938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867236.51954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867236.52092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867236.52169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867236.52238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867236.52264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867236.52366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867236.54188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867236.54200: stdout chunk (state=3): >>><<< 13131 1726867236.54220: stderr chunk (state=3): >>><<< 13131 1726867236.54235: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867236.54242: _low_level_execute_command(): starting 13131 1726867236.54392: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867236.431397-15604-88758091343333/AnsiballZ_command.py && sleep 0' 13131 1726867236.55494: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867236.55667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867236.55718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867236.55806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867236.55898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867236.75138: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 17:20:36.708423", "end": "2024-09-20 17:20:36.747138", "delta": "0:00:00.038715", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13131 1726867236.76561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867236.76567: stderr chunk (state=3): >>>Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867236.76619: stderr chunk (state=3): >>><<< 13131 1726867236.76633: stdout chunk (state=3): >>><<< 13131 1726867236.76662: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 17:20:36.708423", "end": "2024-09-20 17:20:36.747138", "delta": "0:00:00.038715", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867236.76705: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867236.431397-15604-88758091343333/', '_ansible_remote_tmp': '~/.ansible/tmp', 
'_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867236.76715: _low_level_execute_command(): starting 13131 1726867236.76721: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867236.431397-15604-88758091343333/ > /dev/null 2>&1 && sleep 0' 13131 1726867236.77579: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867236.77628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867236.77898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867236.77902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867236.77904: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867236.77906: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867236.77908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867236.77910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867236.77912: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867236.77914: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867236.77916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867236.77918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867236.77920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/ac0999e354' <<< 13131 1726867236.77922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867236.77924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867236.77989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867236.79834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867236.79863: stderr chunk (state=3): >>><<< 13131 1726867236.79866: stdout chunk (state=3): >>><<< 13131 1726867236.79902: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867236.79911: handler run complete 13131 1726867236.79940: Evaluated conditional (False): False 13131 1726867236.79950: attempt loop complete, returning result 13131 1726867236.79953: 
_execute() done 13131 1726867236.79955: dumping result to json 13131 1726867236.79960: done dumping result, returning 13131 1726867236.79969: done running TaskExecutor() for managed_node1/TASK: Remove test interfaces [0affcac9-a3a5-5f24-9b7a-0000000001b5] 13131 1726867236.79972: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001b5 13131 1726867236.80087: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001b5 13131 1726867236.80090: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.038715", "end": "2024-09-20 17:20:36.747138", "rc": 0, "start": "2024-09-20 17:20:36.708423" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 13131 1726867236.80152: no more pending results, returning what we have 13131 1726867236.80156: results queue empty 13131 1726867236.80157: checking for any_errors_fatal 13131 1726867236.80166: done checking for any_errors_fatal 13131 1726867236.80167: checking for max_fail_percentage 13131 1726867236.80169: done checking for max_fail_percentage 13131 1726867236.80169: checking to see if all hosts have failed and the running result is not ok 13131 1726867236.80170: done checking to see if all hosts have failed 13131 1726867236.80171: getting the remaining hosts for this loop 13131 1726867236.80172: done getting the remaining hosts for this loop 13131 1726867236.80175: getting the next task for host managed_node1 13131 1726867236.80184: done getting next task for host 
managed_node1 13131 1726867236.80186: ^ task is: TASK: Stop dnsmasq/radvd services 13131 1726867236.80190: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13131 1726867236.80194: getting variables 13131 1726867236.80196: in VariableManager get_vars() 13131 1726867236.80249: Calling all_inventory to load vars for managed_node1 13131 1726867236.80252: Calling groups_inventory to load vars for managed_node1 13131 1726867236.80254: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867236.80264: Calling all_plugins_play to load vars for managed_node1 13131 1726867236.80267: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867236.80269: Calling groups_plugins_play to load vars for managed_node1 13131 1726867236.82630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867236.85384: done with get_vars() 13131 1726867236.85419: done getting variables 13131 1726867236.85484: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 17:20:36 -0400 (0:00:00.477) 0:00:51.965 ****** 13131 1726867236.85531: entering _queue_task() for managed_node1/shell 13131 1726867236.86110: worker is 1 (out of 1 available) 13131 1726867236.86121: exiting _queue_task() for managed_node1/shell 13131 1726867236.86130: done queuing things up, now waiting for results queue to drain 13131 1726867236.86132: waiting for pending results... 13131 1726867236.86371: running TaskExecutor() for managed_node1/TASK: Stop dnsmasq/radvd services 13131 1726867236.86392: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000001b6 13131 1726867236.86413: variable 'ansible_search_path' from source: unknown 13131 1726867236.86421: variable 'ansible_search_path' from source: unknown 13131 1726867236.86472: calling self._execute() 13131 1726867236.86634: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867236.86647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867236.86662: variable 'omit' from source: magic vars 13131 1726867236.87357: variable 'ansible_distribution_major_version' from source: facts 13131 1726867236.87557: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867236.87560: variable 'omit' from source: magic vars 13131 1726867236.87563: variable 'omit' from source: magic vars 13131 1726867236.87774: variable 'omit' from source: magic vars 13131 1726867236.87779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867236.87782: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867236.87784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867236.87897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867236.87913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867236.87943: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867236.87949: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867236.87956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867236.88081: Set connection var ansible_connection to ssh 13131 1726867236.88160: Set connection var ansible_timeout to 10 13131 1726867236.88191: Set connection var ansible_shell_type to sh 13131 1726867236.88217: Set connection var ansible_shell_executable to /bin/sh 13131 1726867236.88231: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867236.88238: Set connection var ansible_pipelining to False 13131 1726867236.88259: variable 'ansible_shell_executable' from source: unknown 13131 1726867236.88265: variable 'ansible_connection' from source: unknown 13131 1726867236.88271: variable 'ansible_module_compression' from source: unknown 13131 1726867236.88276: variable 'ansible_shell_type' from source: unknown 13131 1726867236.88283: variable 'ansible_shell_executable' from source: unknown 13131 1726867236.88289: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867236.88296: variable 'ansible_pipelining' from source: unknown 13131 1726867236.88301: variable 'ansible_timeout' from source: unknown 13131 1726867236.88307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 
1726867236.88466: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867236.88492: variable 'omit' from source: magic vars 13131 1726867236.88502: starting attempt loop 13131 1726867236.88509: running the handler 13131 1726867236.88523: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867236.88553: _low_level_execute_command(): starting 13131 1726867236.88566: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867236.89313: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867236.89369: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867236.89407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867236.89442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867236.89631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867236.91200: stdout chunk (state=3): >>>/root <<< 13131 1726867236.91301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867236.91356: stderr chunk (state=3): >>><<< 13131 1726867236.91367: stdout chunk (state=3): >>><<< 13131 1726867236.91398: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867236.91423: _low_level_execute_command(): starting 13131 1726867236.91434: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867236.9140928-15650-274364634863811 `" && echo ansible-tmp-1726867236.9140928-15650-274364634863811="` echo /root/.ansible/tmp/ansible-tmp-1726867236.9140928-15650-274364634863811 `" ) && sleep 0' 13131 1726867236.91979: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867236.91994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867236.92009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867236.92036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867236.92064: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867236.92078: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867236.92093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867236.92145: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867236.92194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867236.92212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867236.92258: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 13131 1726867236.92305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867236.94179: stdout chunk (state=3): >>>ansible-tmp-1726867236.9140928-15650-274364634863811=/root/.ansible/tmp/ansible-tmp-1726867236.9140928-15650-274364634863811 <<< 13131 1726867236.94341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867236.94344: stdout chunk (state=3): >>><<< 13131 1726867236.94346: stderr chunk (state=3): >>><<< 13131 1726867236.94448: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867236.9140928-15650-274364634863811=/root/.ansible/tmp/ansible-tmp-1726867236.9140928-15650-274364634863811 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867236.94453: variable 'ansible_module_compression' from source: 
unknown 13131 1726867236.94465: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13131 1726867236.94510: variable 'ansible_facts' from source: unknown 13131 1726867236.94605: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867236.9140928-15650-274364634863811/AnsiballZ_command.py 13131 1726867236.94807: Sending initial data 13131 1726867236.94810: Sent initial data (156 bytes) 13131 1726867236.95430: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867236.95475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867236.95511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867236.97092: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server 
supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867236.97139: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13131 1726867236.97190: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpu7_bds1m /root/.ansible/tmp/ansible-tmp-1726867236.9140928-15650-274364634863811/AnsiballZ_command.py <<< 13131 1726867236.97193: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867236.9140928-15650-274364634863811/AnsiballZ_command.py" <<< 13131 1726867236.97235: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpu7_bds1m" to remote "/root/.ansible/tmp/ansible-tmp-1726867236.9140928-15650-274364634863811/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867236.9140928-15650-274364634863811/AnsiballZ_command.py" <<< 13131 1726867236.97972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867236.98056: stderr chunk (state=3): >>><<< 13131 1726867236.98060: stdout chunk (state=3): >>><<< 13131 1726867236.98070: done transferring module to remote 13131 1726867236.98086: _low_level_execute_command(): starting 13131 1726867236.98091: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867236.9140928-15650-274364634863811/ 
/root/.ansible/tmp/ansible-tmp-1726867236.9140928-15650-274364634863811/AnsiballZ_command.py && sleep 0' 13131 1726867236.98675: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867236.98691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867236.98707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867236.98728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867236.98764: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867236.98767: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867236.98770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867236.98773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867236.98782: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867236.98819: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867236.98837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867236.98928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867236.98934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867236.98940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867236.98979: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13131 1726867237.00721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867237.00739: stderr chunk (state=3): >>><<< 13131 1726867237.00742: stdout chunk (state=3): >>><<< 13131 1726867237.00757: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867237.00760: _low_level_execute_command(): starting 13131 1726867237.00765: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867236.9140928-15650-274364634863811/AnsiballZ_command.py && sleep 0' 13131 1726867237.01184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867237.01189: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.01191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 13131 1726867237.01194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867237.01196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.01256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867237.01260: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867237.01314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867237.19207: stdout chunk (state=3): >>> <<< 13131 1726867237.19212: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl 
is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 17:20:37.162179", "end": "2024-09-20 17:20:37.189092", "delta": "0:00:00.026913", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13131 1726867237.20754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867237.20781: stderr chunk (state=3): >>><<< 13131 1726867237.20785: stdout chunk (state=3): >>><<< 13131 1726867237.20801: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 17:20:37.162179", "end": "2024-09-20 17:20:37.189092", "delta": "0:00:00.026913", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
13131 1726867237.20834: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867236.9140928-15650-274364634863811/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867237.20841: _low_level_execute_command(): starting 13131 1726867237.20845: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867236.9140928-15650-274364634863811/ > /dev/null 2>&1 && sleep 0' 13131 1726867237.21261: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867237.21265: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867237.21302: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.21362: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867237.21369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867237.21440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867237.23253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867237.23272: stderr chunk (state=3): >>><<< 13131 1726867237.23278: stdout chunk (state=3): >>><<< 13131 1726867237.23298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867237.23303: handler run complete 13131 1726867237.23321: Evaluated conditional (False): False 13131 1726867237.23329: attempt loop complete, returning result 13131 1726867237.23331: _execute() done 13131 1726867237.23334: dumping result to json 13131 1726867237.23339: done dumping result, returning 13131 1726867237.23346: done running TaskExecutor() for managed_node1/TASK: Stop dnsmasq/radvd services [0affcac9-a3a5-5f24-9b7a-0000000001b6] 13131 1726867237.23348: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001b6 13131 1726867237.23447: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001b6 13131 1726867237.23450: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.026913", "end": "2024-09-20 17:20:37.189092", "rc": 0, "start": "2024-09-20 17:20:37.162179" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active 
firewalld inactive 13131 1726867237.23517: no more pending results, returning what we have 13131 1726867237.23521: results queue empty 13131 1726867237.23521: checking for any_errors_fatal 13131 1726867237.23530: done checking for any_errors_fatal 13131 1726867237.23531: checking for max_fail_percentage 13131 1726867237.23532: done checking for max_fail_percentage 13131 1726867237.23533: checking to see if all hosts have failed and the running result is not ok 13131 1726867237.23534: done checking to see if all hosts have failed 13131 1726867237.23534: getting the remaining hosts for this loop 13131 1726867237.23536: done getting the remaining hosts for this loop 13131 1726867237.23539: getting the next task for host managed_node1 13131 1726867237.23547: done getting next task for host managed_node1 13131 1726867237.23549: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 13131 1726867237.23552: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867237.23556: getting variables 13131 1726867237.23557: in VariableManager get_vars() 13131 1726867237.23610: Calling all_inventory to load vars for managed_node1 13131 1726867237.23613: Calling groups_inventory to load vars for managed_node1 13131 1726867237.23615: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867237.23624: Calling all_plugins_play to load vars for managed_node1 13131 1726867237.23626: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867237.23629: Calling groups_plugins_play to load vars for managed_node1 13131 1726867237.24429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867237.25676: done with get_vars() 13131 1726867237.25699: done getting variables 13131 1726867237.25760: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:248 Friday 20 September 2024 17:20:37 -0400 (0:00:00.402) 0:00:52.368 ****** 13131 1726867237.25797: entering _queue_task() for managed_node1/command 13131 1726867237.26010: worker is 1 (out of 1 available) 13131 1726867237.26024: exiting _queue_task() for managed_node1/command 13131 1726867237.26035: done queuing things up, now waiting for results queue to drain 13131 1726867237.26036: waiting for pending results... 
13131 1726867237.26215: running TaskExecutor() for managed_node1/TASK: Restore the /etc/resolv.conf for initscript 13131 1726867237.26296: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000001b7 13131 1726867237.26307: variable 'ansible_search_path' from source: unknown 13131 1726867237.26340: calling self._execute() 13131 1726867237.26419: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867237.26423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867237.26432: variable 'omit' from source: magic vars 13131 1726867237.26896: variable 'ansible_distribution_major_version' from source: facts 13131 1726867237.26899: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867237.26961: variable 'network_provider' from source: set_fact 13131 1726867237.26972: Evaluated conditional (network_provider == "initscripts"): False 13131 1726867237.26980: when evaluation is False, skipping this task 13131 1726867237.26987: _execute() done 13131 1726867237.26994: dumping result to json 13131 1726867237.27010: done dumping result, returning 13131 1726867237.27020: done running TaskExecutor() for managed_node1/TASK: Restore the /etc/resolv.conf for initscript [0affcac9-a3a5-5f24-9b7a-0000000001b7] 13131 1726867237.27029: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001b7 skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13131 1726867237.27244: no more pending results, returning what we have 13131 1726867237.27248: results queue empty 13131 1726867237.27250: checking for any_errors_fatal 13131 1726867237.27261: done checking for any_errors_fatal 13131 1726867237.27262: checking for max_fail_percentage 13131 1726867237.27264: done checking for max_fail_percentage 13131 1726867237.27265: checking to see if all hosts have failed and the running result is not ok 13131 
1726867237.27265: done checking to see if all hosts have failed 13131 1726867237.27266: getting the remaining hosts for this loop 13131 1726867237.27267: done getting the remaining hosts for this loop 13131 1726867237.27271: getting the next task for host managed_node1 13131 1726867237.27281: done getting next task for host managed_node1 13131 1726867237.27284: ^ task is: TASK: Verify network state restored to default 13131 1726867237.27287: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867237.27292: getting variables 13131 1726867237.27293: in VariableManager get_vars() 13131 1726867237.27356: Calling all_inventory to load vars for managed_node1 13131 1726867237.27359: Calling groups_inventory to load vars for managed_node1 13131 1726867237.27361: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867237.27372: Calling all_plugins_play to load vars for managed_node1 13131 1726867237.27375: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867237.27384: Calling groups_plugins_play to load vars for managed_node1 13131 1726867237.27394: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001b7 13131 1726867237.27397: WORKER PROCESS EXITING 13131 1726867237.28294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867237.29147: done with get_vars() 13131 1726867237.29161: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:253 Friday 20 September 2024 17:20:37 -0400 (0:00:00.034) 0:00:52.402 ****** 13131 1726867237.29224: entering _queue_task() for managed_node1/include_tasks 13131 1726867237.29604: worker is 1 (out of 1 available) 13131 1726867237.29616: exiting _queue_task() for managed_node1/include_tasks 13131 1726867237.29626: done queuing things up, now waiting for results queue to drain 13131 1726867237.29627: waiting for pending results... 
13131 1726867237.29666: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 13131 1726867237.29745: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000001b8 13131 1726867237.29757: variable 'ansible_search_path' from source: unknown 13131 1726867237.29786: calling self._execute() 13131 1726867237.29863: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867237.29868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867237.29879: variable 'omit' from source: magic vars 13131 1726867237.30151: variable 'ansible_distribution_major_version' from source: facts 13131 1726867237.30161: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867237.30165: _execute() done 13131 1726867237.30168: dumping result to json 13131 1726867237.30170: done dumping result, returning 13131 1726867237.30178: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0affcac9-a3a5-5f24-9b7a-0000000001b8] 13131 1726867237.30182: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001b8 13131 1726867237.30266: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000001b8 13131 1726867237.30269: WORKER PROCESS EXITING 13131 1726867237.30305: no more pending results, returning what we have 13131 1726867237.30309: in VariableManager get_vars() 13131 1726867237.30360: Calling all_inventory to load vars for managed_node1 13131 1726867237.30363: Calling groups_inventory to load vars for managed_node1 13131 1726867237.30365: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867237.30374: Calling all_plugins_play to load vars for managed_node1 13131 1726867237.30376: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867237.30381: Calling groups_plugins_play to load vars for managed_node1 13131 1726867237.31359: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867237.32215: done with get_vars() 13131 1726867237.32228: variable 'ansible_search_path' from source: unknown 13131 1726867237.32237: we have included files to process 13131 1726867237.32237: generating all_blocks data 13131 1726867237.32239: done generating all_blocks data 13131 1726867237.32242: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13131 1726867237.32242: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13131 1726867237.32244: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13131 1726867237.32495: done processing included file 13131 1726867237.32497: iterating over new_blocks loaded from include file 13131 1726867237.32497: in VariableManager get_vars() 13131 1726867237.32513: done with get_vars() 13131 1726867237.32514: filtering new block on tags 13131 1726867237.32537: done filtering new block on tags 13131 1726867237.32538: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 13131 1726867237.32542: extending task lists for all hosts with included blocks 13131 1726867237.33199: done extending task lists 13131 1726867237.33200: done processing included files 13131 1726867237.33201: results queue empty 13131 1726867237.33201: checking for any_errors_fatal 13131 1726867237.33203: done checking for any_errors_fatal 13131 1726867237.33203: checking for max_fail_percentage 13131 1726867237.33204: done checking for max_fail_percentage 13131 1726867237.33205: checking to see if all hosts have failed and the running 
result is not ok 13131 1726867237.33205: done checking to see if all hosts have failed 13131 1726867237.33206: getting the remaining hosts for this loop 13131 1726867237.33207: done getting the remaining hosts for this loop 13131 1726867237.33208: getting the next task for host managed_node1 13131 1726867237.33211: done getting next task for host managed_node1 13131 1726867237.33212: ^ task is: TASK: Check routes and DNS 13131 1726867237.33214: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13131 1726867237.33216: getting variables 13131 1726867237.33216: in VariableManager get_vars() 13131 1726867237.33228: Calling all_inventory to load vars for managed_node1 13131 1726867237.33229: Calling groups_inventory to load vars for managed_node1 13131 1726867237.33231: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867237.33234: Calling all_plugins_play to load vars for managed_node1 13131 1726867237.33235: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867237.33237: Calling groups_plugins_play to load vars for managed_node1 13131 1726867237.33928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867237.34754: done with get_vars() 13131 1726867237.34767: done getting variables 13131 1726867237.34795: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 17:20:37 -0400 (0:00:00.055) 0:00:52.458 ****** 13131 1726867237.34815: entering _queue_task() for managed_node1/shell 13131 1726867237.35010: worker is 1 (out of 1 available) 13131 1726867237.35024: exiting _queue_task() for managed_node1/shell 13131 1726867237.35035: done queuing things up, now waiting for results queue to drain 13131 1726867237.35036: waiting for pending results... 
13131 1726867237.35196: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 13131 1726867237.35265: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000009f0 13131 1726867237.35280: variable 'ansible_search_path' from source: unknown 13131 1726867237.35283: variable 'ansible_search_path' from source: unknown 13131 1726867237.35312: calling self._execute() 13131 1726867237.35391: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867237.35397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867237.35405: variable 'omit' from source: magic vars 13131 1726867237.35673: variable 'ansible_distribution_major_version' from source: facts 13131 1726867237.35690: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867237.35695: variable 'omit' from source: magic vars 13131 1726867237.35730: variable 'omit' from source: magic vars 13131 1726867237.35754: variable 'omit' from source: magic vars 13131 1726867237.35785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867237.35813: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867237.35829: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867237.35842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867237.35852: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867237.35874: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867237.35879: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867237.35882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867237.35951: 
Set connection var ansible_connection to ssh 13131 1726867237.35958: Set connection var ansible_timeout to 10 13131 1726867237.35960: Set connection var ansible_shell_type to sh 13131 1726867237.35967: Set connection var ansible_shell_executable to /bin/sh 13131 1726867237.35975: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867237.35981: Set connection var ansible_pipelining to False 13131 1726867237.35997: variable 'ansible_shell_executable' from source: unknown 13131 1726867237.36000: variable 'ansible_connection' from source: unknown 13131 1726867237.36003: variable 'ansible_module_compression' from source: unknown 13131 1726867237.36006: variable 'ansible_shell_type' from source: unknown 13131 1726867237.36010: variable 'ansible_shell_executable' from source: unknown 13131 1726867237.36012: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867237.36017: variable 'ansible_pipelining' from source: unknown 13131 1726867237.36024: variable 'ansible_timeout' from source: unknown 13131 1726867237.36027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867237.36128: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867237.36141: variable 'omit' from source: magic vars 13131 1726867237.36144: starting attempt loop 13131 1726867237.36147: running the handler 13131 1726867237.36152: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867237.36168: 
_low_level_execute_command(): starting 13131 1726867237.36174: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867237.36645: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867237.36675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867237.36686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.36689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867237.36692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.36739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867237.36742: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867237.36748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867237.36809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867237.38470: stdout chunk (state=3): >>>/root <<< 13131 1726867237.38563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867237.38595: stderr chunk (state=3): >>><<< 13131 1726867237.38597: stdout 
chunk (state=3): >>><<< 13131 1726867237.38616: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867237.38634: _low_level_execute_command(): starting 13131 1726867237.38638: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867237.3862267-15695-112712811706862 `" && echo ansible-tmp-1726867237.3862267-15695-112712811706862="` echo /root/.ansible/tmp/ansible-tmp-1726867237.3862267-15695-112712811706862 `" ) && sleep 0' 13131 1726867237.39038: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867237.39050: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867237.39075: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.39081: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 13131 1726867237.39085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.39134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867237.39138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867237.39189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867237.41070: stdout chunk (state=3): >>>ansible-tmp-1726867237.3862267-15695-112712811706862=/root/.ansible/tmp/ansible-tmp-1726867237.3862267-15695-112712811706862 <<< 13131 1726867237.41175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867237.41200: stderr chunk (state=3): >>><<< 13131 1726867237.41203: stdout chunk (state=3): >>><<< 13131 1726867237.41216: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867237.3862267-15695-112712811706862=/root/.ansible/tmp/ansible-tmp-1726867237.3862267-15695-112712811706862 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867237.41242: variable 'ansible_module_compression' from source: unknown 13131 1726867237.41281: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13131 1726867237.41310: variable 'ansible_facts' from source: unknown 13131 1726867237.41363: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867237.3862267-15695-112712811706862/AnsiballZ_command.py 13131 1726867237.41461: Sending initial data 13131 1726867237.41464: Sent initial data (156 bytes) 13131 1726867237.41849: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867237.41881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 
originally 10.31.12.57 <<< 13131 1726867237.41884: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867237.41886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13131 1726867237.41890: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867237.41892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.41940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867237.41943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867237.42003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867237.43550: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13131 1726867237.43555: stderr chunk (state=3): >>>debug2: Unrecognised server 
extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867237.43593: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13131 1726867237.43640: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpkqtnoj4h /root/.ansible/tmp/ansible-tmp-1726867237.3862267-15695-112712811706862/AnsiballZ_command.py <<< 13131 1726867237.43646: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867237.3862267-15695-112712811706862/AnsiballZ_command.py" <<< 13131 1726867237.43685: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpkqtnoj4h" to remote "/root/.ansible/tmp/ansible-tmp-1726867237.3862267-15695-112712811706862/AnsiballZ_command.py" <<< 13131 1726867237.43689: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867237.3862267-15695-112712811706862/AnsiballZ_command.py" <<< 13131 1726867237.44225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867237.44257: stderr chunk (state=3): >>><<< 13131 1726867237.44261: stdout chunk (state=3): >>><<< 13131 1726867237.44309: done transferring module to remote 13131 1726867237.44318: _low_level_execute_command(): starting 13131 1726867237.44323: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867237.3862267-15695-112712811706862/ /root/.ansible/tmp/ansible-tmp-1726867237.3862267-15695-112712811706862/AnsiballZ_command.py && sleep 0' 13131 1726867237.44730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867237.44733: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 13131 1726867237.44739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.44741: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867237.44743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.44782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867237.44789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867237.44840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867237.46573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867237.46595: stderr chunk (state=3): >>><<< 13131 1726867237.46599: stdout chunk (state=3): >>><<< 13131 1726867237.46612: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867237.46615: _low_level_execute_command(): starting 13131 1726867237.46618: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867237.3862267-15695-112712811706862/AnsiballZ_command.py && sleep 0' 13131 1726867237.47010: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867237.47013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.47015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867237.47018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 
originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.47053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867237.47064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867237.47126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867237.63290: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:fe:d3:7d:4f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.12.57/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3133sec preferred_lft 3133sec\n inet6 fe80::8ff:feff:fed3:7d4f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.57 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.57 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 17:20:37.620729", "end": "2024-09-20 17:20:37.629477", "delta": "0:00:00.008748", "msg": "", "invocation": 
{"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13131 1726867237.64791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 13131 1726867237.64796: stdout chunk (state=3): >>><<< 13131 1726867237.64804: stderr chunk (state=3): >>><<< 13131 1726867237.64941: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:fe:d3:7d:4f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.12.57/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3133sec preferred_lft 3133sec\n inet6 fe80::8ff:feff:fed3:7d4f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.57 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.57 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf 
]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 17:20:37.620729", "end": "2024-09-20 17:20:37.629477", "delta": "0:00:00.008748", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
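The `AnsiballZ_command.py` run above prints a single JSON object on stdout, which the controller parses into the task result shown in `_execute_module`. A minimal sketch of that controller-side decode, using an abridged payload modeled on the result above (field names verbatim from the log, long values elided):

```python
import json

# Abridged module result, modeled on the AnsiballZ_command.py stdout above.
raw = ('{"changed": true, "stdout": "IP\\n1: lo: mtu 65536 ...", '
       '"stderr": "", "rc": 0, "delta": "0:00:00.008748"}')

result = json.loads(raw)
print(result["rc"], result["changed"])
```

The `rc`, `stdout`, and `stderr` keys are what later conditionals and `register`ed variables see on the playbook side.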
13131 1726867237.64995: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867237.3862267-15695-112712811706862/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867237.65003: _low_level_execute_command(): starting 13131 1726867237.65011: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867237.3862267-15695-112712811706862/ > /dev/null 2>&1 && sleep 0' 13131 1726867237.66235: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867237.66393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.66433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867237.66699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867237.66723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867237.66804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867237.68627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867237.68641: stderr chunk (state=3): >>><<< 13131 1726867237.68665: stdout chunk (state=3): >>><<< 13131 1726867237.68688: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867237.68694: handler run complete 13131 1726867237.68735: Evaluated conditional (False): False 13131 1726867237.68738: attempt loop complete, returning result 13131 1726867237.68741: _execute() done 13131 1726867237.68743: dumping result to json 13131 1726867237.68745: done dumping result, returning 13131 1726867237.68842: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [0affcac9-a3a5-5f24-9b7a-0000000009f0] 13131 1726867237.68846: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000009f0 13131 1726867237.68987: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000009f0 13131 1726867237.68991: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008748", "end": "2024-09-20 17:20:37.629477", "rc": 0, "start": "2024-09-20 17:20:37.620729" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:fe:d3:7d:4f brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.12.57/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3133sec preferred_lft 3133sec inet6 fe80::8ff:feff:fed3:7d4f/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.57 metric 100 
10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.57 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 13131 1726867237.69072: no more pending results, returning what we have 13131 1726867237.69079: results queue empty 13131 1726867237.69081: checking for any_errors_fatal 13131 1726867237.69083: done checking for any_errors_fatal 13131 1726867237.69083: checking for max_fail_percentage 13131 1726867237.69085: done checking for max_fail_percentage 13131 1726867237.69086: checking to see if all hosts have failed and the running result is not ok 13131 1726867237.69087: done checking to see if all hosts have failed 13131 1726867237.69087: getting the remaining hosts for this loop 13131 1726867237.69089: done getting the remaining hosts for this loop 13131 1726867237.69092: getting the next task for host managed_node1 13131 1726867237.69100: done getting next task for host managed_node1 13131 1726867237.69102: ^ task is: TASK: Verify DNS and network connectivity 13131 1726867237.69109: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 13131 1726867237.69119: getting variables 13131 1726867237.69121: in VariableManager get_vars() 13131 1726867237.69582: Calling all_inventory to load vars for managed_node1 13131 1726867237.69586: Calling groups_inventory to load vars for managed_node1 13131 1726867237.69589: Calling all_plugins_inventory to load vars for managed_node1 13131 1726867237.69600: Calling all_plugins_play to load vars for managed_node1 13131 1726867237.69603: Calling groups_plugins_inventory to load vars for managed_node1 13131 1726867237.69609: Calling groups_plugins_play to load vars for managed_node1 13131 1726867237.72602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13131 1726867237.75711: done with get_vars() 13131 1726867237.75734: done getting variables 13131 1726867237.76101: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 17:20:37 -0400 (0:00:00.413) 0:00:52.871 ****** 13131 1726867237.76138: entering _queue_task() for managed_node1/shell 13131 1726867237.76721: worker is 1 (out of 1 available) 13131 1726867237.76731: exiting _queue_task() for managed_node1/shell 13131 1726867237.76741: done queuing things up, now waiting for results queue to drain 13131 1726867237.76743: waiting for pending results... 
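The remote scratch directories created earlier in this run (e.g. `ansible-tmp-1726867237.3862267-15695-112712811706862`) follow an `ansible-tmp-<epoch>-<pid>-<random>` pattern. An illustrative reconstruction of that naming scheme — this is a sketch inferred from the log, not ansible-core's actual implementation:

```python
import os
import random
import time

# Pattern seen in the log: ansible-tmp-<epoch>-<controller pid>-<random suffix>.
# Illustrative reconstruction only; ansible-core builds these names internally.
def remote_tmp_name() -> str:
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))

print(remote_tmp_name())
```

The uniqueness of the random suffix is what lets concurrent tasks on the same host create their tmp dirs without colliding.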
13131 1726867237.77299: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 13131 1726867237.77388: in run() - task 0affcac9-a3a5-5f24-9b7a-0000000009f1 13131 1726867237.77408: variable 'ansible_search_path' from source: unknown 13131 1726867237.77420: variable 'ansible_search_path' from source: unknown 13131 1726867237.77462: calling self._execute() 13131 1726867237.77746: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867237.77761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867237.77779: variable 'omit' from source: magic vars 13131 1726867237.78534: variable 'ansible_distribution_major_version' from source: facts 13131 1726867237.78597: Evaluated conditional (ansible_distribution_major_version != '6'): True 13131 1726867237.78858: variable 'ansible_facts' from source: unknown 13131 1726867237.80406: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 13131 1726867237.80472: variable 'omit' from source: magic vars 13131 1726867237.80529: variable 'omit' from source: magic vars 13131 1726867237.80607: variable 'omit' from source: magic vars 13131 1726867237.80740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13131 1726867237.80820: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13131 1726867237.80908: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13131 1726867237.80931: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867237.80948: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13131 1726867237.81031: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13131 1726867237.81040: variable 
'ansible_host' from source: host vars for 'managed_node1' 13131 1726867237.81048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867237.81485: Set connection var ansible_connection to ssh 13131 1726867237.81489: Set connection var ansible_timeout to 10 13131 1726867237.81491: Set connection var ansible_shell_type to sh 13131 1726867237.81493: Set connection var ansible_shell_executable to /bin/sh 13131 1726867237.81495: Set connection var ansible_module_compression to ZIP_DEFLATED 13131 1726867237.81497: Set connection var ansible_pipelining to False 13131 1726867237.81500: variable 'ansible_shell_executable' from source: unknown 13131 1726867237.81502: variable 'ansible_connection' from source: unknown 13131 1726867237.81504: variable 'ansible_module_compression' from source: unknown 13131 1726867237.81506: variable 'ansible_shell_type' from source: unknown 13131 1726867237.81508: variable 'ansible_shell_executable' from source: unknown 13131 1726867237.81509: variable 'ansible_host' from source: host vars for 'managed_node1' 13131 1726867237.81511: variable 'ansible_pipelining' from source: unknown 13131 1726867237.81513: variable 'ansible_timeout' from source: unknown 13131 1726867237.81516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13131 1726867237.81810: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867237.81813: variable 'omit' from source: magic vars 13131 1726867237.81816: starting attempt loop 13131 1726867237.81818: running the handler 13131 1726867237.81832: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13131 1726867237.81857: _low_level_execute_command(): starting 13131 1726867237.81927: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13131 1726867237.83230: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.83297: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.83412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867237.83451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867237.83559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867237.85150: stdout chunk (state=3): >>>/root <<< 13131 1726867237.85245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867237.85297: stderr chunk (state=3): >>><<< 13131 1726867237.85307: stdout chunk (state=3): 
>>><<< 13131 1726867237.85334: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867237.85359: _low_level_execute_command(): starting 13131 1726867237.85400: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867237.853438-15728-224071403049383 `" && echo ansible-tmp-1726867237.853438-15728-224071403049383="` echo /root/.ansible/tmp/ansible-tmp-1726867237.853438-15728-224071403049383 `" ) && sleep 0' 13131 1726867237.86598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867237.86611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.86636: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.86738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867237.86875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867237.86880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867237.86928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867237.88938: stdout chunk (state=3): >>>ansible-tmp-1726867237.853438-15728-224071403049383=/root/.ansible/tmp/ansible-tmp-1726867237.853438-15728-224071403049383 <<< 13131 1726867237.89013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867237.89105: stderr chunk (state=3): >>><<< 13131 1726867237.89149: stdout chunk (state=3): >>><<< 13131 1726867237.89172: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867237.853438-15728-224071403049383=/root/.ansible/tmp/ansible-tmp-1726867237.853438-15728-224071403049383 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867237.89584: variable 'ansible_module_compression' from source: unknown 13131 1726867237.89587: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13131ocdjuqyq/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13131 1726867237.89590: variable 'ansible_facts' from source: unknown 13131 1726867237.89701: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867237.853438-15728-224071403049383/AnsiballZ_command.py 13131 1726867237.89820: Sending initial data 13131 1726867237.89890: Sent initial data (155 bytes) 13131 1726867237.91135: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867237.91150: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867237.91292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 
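The mkdir command executed above is Ansible's standard remote-workspace setup: a subshell sets `umask 77` so everything is created private to the login user, `mkdir -p` builds the `~/.ansible/tmp` base, a second `mkdir` creates the per-task `ansible-tmp-<timestamp>-<pid>-<random>` directory, and the trailing `echo` reports the resolved path back to the controller. A minimal local recreation (the `/tmp/ansible-demo-$$` base is a stand-in for the real generated name):

```shell
# Recreate the remote tmp-dir pattern from the log. umask 77 makes the
# directories mode 0700; the echo is how the resolved path gets back to
# the controller. The path here is a demo placeholder, not Ansible's.
base=/tmp/ansible-demo-$$
( umask 77 && mkdir -p "$base" \
  && mkdir "$base/ansible-tmp-example" \
  && echo ansible_tmp="$base/ansible-tmp-example" )
# Confirm the 0700 mode that umask 77 guarantees (GNU stat):
stat -c '%a' "$base/ansible-tmp-example" | tee /tmp/ansible-demo-mode.txt
rm -rf "$base"   # the 'rm -f -r ... && sleep 0' near the end of the log does the same cleanup remotely
```

The `&& sleep 0` suffix on the real command is a quirk of Ansible's low-level executor, not part of the workspace logic itself.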
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.91355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867237.91449: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867237.91468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867237.91550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867237.93165: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13131 1726867237.93206: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13131 1726867237.93249: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpt9upuml_ /root/.ansible/tmp/ansible-tmp-1726867237.853438-15728-224071403049383/AnsiballZ_command.py <<< 13131 1726867237.93253: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867237.853438-15728-224071403049383/AnsiballZ_command.py" <<< 13131 1726867237.93416: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13131ocdjuqyq/tmpt9upuml_" to remote "/root/.ansible/tmp/ansible-tmp-1726867237.853438-15728-224071403049383/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867237.853438-15728-224071403049383/AnsiballZ_command.py" <<< 13131 1726867237.94731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867237.94750: stderr chunk (state=3): >>><<< 13131 1726867237.94753: stdout chunk (state=3): >>><<< 13131 1726867237.94827: done transferring module to remote 13131 1726867237.94830: _low_level_execute_command(): starting 13131 1726867237.94833: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867237.853438-15728-224071403049383/ /root/.ansible/tmp/ansible-tmp-1726867237.853438-15728-224071403049383/AnsiballZ_command.py && sleep 0' 13131 1726867237.96120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867237.96124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.96127: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867237.96129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.96244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867237.96307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867237.98217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867237.98221: stderr chunk (state=3): >>><<< 13131 1726867237.98226: stdout chunk (state=3): >>><<< 13131 1726867237.98246: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found 
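The repeated `auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354'` lines show OpenSSH connection multiplexing at work: Ansible keeps one master SSH connection per host, and every `_low_level_execute_command()` call reuses it (`mux_client_request_session`) instead of renegotiating. A hedged sketch of the client options involved; `ssh -G` only prints the resolved configuration, so no connection is attempted, and the host name and ControlPath below are placeholders rather than Ansible's actual values.

```shell
# Show the effective multiplexing options without connecting anywhere.
# ControlMaster=auto creates a master on first use; ControlPath names the
# mux socket; ControlPersist keeps the master alive between commands.
if command -v ssh >/dev/null; then
  ssh -G -o ControlMaster=auto \
         -o ControlPath=/tmp/cp-%C \
         -o ControlPersist=60s \
         placeholder-host | grep -i '^control' | tee /tmp/mux-opts.txt
else
  echo "ssh not installed" | tee /tmp/mux-opts.txt
fi
```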
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867237.98250: _low_level_execute_command(): starting 13131 1726867237.98255: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867237.853438-15728-224071403049383/AnsiballZ_command.py && sleep 0' 13131 1726867237.99482: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867237.99486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867237.99488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867237.99491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867237.99494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867237.99496: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867237.99682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.99685: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13131 1726867237.99687: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 13131 1726867237.99689: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13131 1726867237.99691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867237.99693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867237.99695: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867237.99697: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867237.99699: stderr chunk (state=3): >>>debug2: match found <<< 13131 1726867237.99701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867237.99773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 13131 1726867237.99783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867238.00282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867238.22480: stdout chunk (state=3): >>> <<< 13131 1726867238.22508: stdout chunk (state=3): >>>{"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org 
mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 14740 0 --:--:-- --:--:-- --:--:-- 15250\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 9844 0 --:--:-- --:--:-- --:--:-- 10034", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 17:20:38.150066", "end": "2024-09-20 17:20:38.222902", "delta": "0:00:00.072836", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13131 1726867238.24083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867238.24137: stderr chunk (state=3): >>>Shared connection to 10.31.12.57 closed. 
<<< 13131 1726867238.24141: stdout chunk (state=3): >>><<< 13131 1726867238.24148: stderr chunk (state=3): >>><<< 13131 1726867238.24301: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 14740 0 --:--:-- --:--:-- --:--:-- 15250\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 9844 0 --:--:-- --:--:-- --:--:-- 10034", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 17:20:38.150066", "end": "2024-09-20 17:20:38.222902", "delta": "0:00:00.072836", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 13131 1726867238.24341: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867237.853438-15728-224071403049383/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13131 1726867238.24349: _low_level_execute_command(): starting 13131 1726867238.24354: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867237.853438-15728-224071403049383/ > /dev/null 2>&1 && sleep 0' 13131 1726867238.25895: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13131 1726867238.25906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13131 1726867238.25915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13131 1726867238.25929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13131 1726867238.25941: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 13131 1726867238.25948: stderr chunk (state=3): >>>debug2: match not found <<< 13131 1726867238.25959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13131 1726867238.26056: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 13131 1726867238.26283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13131 1726867238.26286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13131 1726867238.28283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13131 1726867238.28287: stdout chunk (state=3): >>><<< 13131 1726867238.28294: stderr chunk (state=3): >>><<< 13131 1726867238.28311: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13131 1726867238.28318: handler run complete 13131 1726867238.28345: Evaluated conditional (False): False 13131 1726867238.28355: attempt loop complete, returning result 13131 1726867238.28358: _execute() done 13131 1726867238.28360: dumping result to json 13131 1726867238.28367: done dumping result, returning 13131 1726867238.28376: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [0affcac9-a3a5-5f24-9b7a-0000000009f1] 13131 1726867238.28380: sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000009f1
ok: [managed_node1] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.072836",
    "end": "2024-09-20 17:20:38.222902",
    "rc": 0,
    "start": "2024-09-20 17:20:38.150066"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   305  100   305    0     0  14740      0 --:--:-- --:--:-- --:--:-- 15250
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   291  100   291    0     0   9844      0 --:--:-- --:--:-- --:--:-- 10034

13131 1726867238.28572: no more pending results, returning what we have 13131 1726867238.28576: results queue empty 13131 1726867238.28580:
checking for any_errors_fatal
13131 1726867238.28591: done checking for any_errors_fatal
13131 1726867238.28592: checking for max_fail_percentage
13131 1726867238.28593: done checking for max_fail_percentage
13131 1726867238.28594: checking to see if all hosts have failed and the running result is not ok
13131 1726867238.28595: done checking to see if all hosts have failed
13131 1726867238.28595: getting the remaining hosts for this loop
13131 1726867238.28597: done getting the remaining hosts for this loop
13131 1726867238.28600: getting the next task for host managed_node1
13131 1726867238.28616: done getting next task for host managed_node1
13131 1726867238.28619: ^ task is: TASK: meta (flush_handlers)
13131 1726867238.28621: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867238.28625: getting variables
13131 1726867238.28627: in VariableManager get_vars()
13131 1726867238.28866: Calling all_inventory to load vars for managed_node1
13131 1726867238.28869: Calling groups_inventory to load vars for managed_node1
13131 1726867238.28872: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867238.28885: Calling all_plugins_play to load vars for managed_node1
13131 1726867238.28888: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867238.28891: Calling groups_plugins_play to load vars for managed_node1
13131 1726867238.29584: done sending task result for task 0affcac9-a3a5-5f24-9b7a-0000000009f1
13131 1726867238.29587: WORKER PROCESS EXITING
13131 1726867238.32398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867238.37223: done with get_vars()
13131 1726867238.37255: done getting variables
13131 1726867238.37568: in VariableManager get_vars()
13131 1726867238.37593: Calling all_inventory to load vars for managed_node1
13131 1726867238.37595: Calling groups_inventory to load vars for managed_node1
13131 1726867238.37597: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867238.37602: Calling all_plugins_play to load vars for managed_node1
13131 1726867238.37604: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867238.37607: Calling groups_plugins_play to load vars for managed_node1
13131 1726867238.40236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867238.43561: done with get_vars()
13131 1726867238.43700: done queuing things up, now waiting for results queue to drain
13131 1726867238.43702: results queue empty
13131 1726867238.43703: checking for any_errors_fatal
13131 1726867238.43707: done checking for any_errors_fatal
13131 1726867238.43708: checking for max_fail_percentage
13131 1726867238.43709: done checking for max_fail_percentage
13131 1726867238.43709: checking to see if all hosts have failed and the running result is not ok
13131 1726867238.43710: done checking to see if all hosts have failed
13131 1726867238.43711: getting the remaining hosts for this loop
13131 1726867238.43712: done getting the remaining hosts for this loop
13131 1726867238.43715: getting the next task for host managed_node1
13131 1726867238.43719: done getting next task for host managed_node1
13131 1726867238.43721: ^ task is: TASK: meta (flush_handlers)
13131 1726867238.43722: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867238.43725: getting variables
13131 1726867238.43726: in VariableManager get_vars()
13131 1726867238.43747: Calling all_inventory to load vars for managed_node1
13131 1726867238.43750: Calling groups_inventory to load vars for managed_node1
13131 1726867238.43752: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867238.43757: Calling all_plugins_play to load vars for managed_node1
13131 1726867238.43759: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867238.43762: Calling groups_plugins_play to load vars for managed_node1
13131 1726867238.46143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867238.49342: done with get_vars()
13131 1726867238.49481: done getting variables
13131 1726867238.49532: in VariableManager get_vars()
13131 1726867238.49554: Calling all_inventory to load vars for managed_node1
13131 1726867238.49556: Calling groups_inventory to load vars for managed_node1
13131 1726867238.49558: Calling all_plugins_inventory to load vars for managed_node1
13131 1726867238.49563: Calling all_plugins_play to load vars for managed_node1
13131 1726867238.49565: Calling groups_plugins_inventory to load vars for managed_node1
13131 1726867238.49568: Calling groups_plugins_play to load vars for managed_node1
13131 1726867238.52002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13131 1726867238.55297: done with get_vars()
13131 1726867238.55325: done queuing things up, now waiting for results queue to drain
13131 1726867238.55327: results queue empty
13131 1726867238.55328: checking for any_errors_fatal
13131 1726867238.55444: done checking for any_errors_fatal
13131 1726867238.55445: checking for max_fail_percentage
13131 1726867238.55447: done checking for max_fail_percentage
13131 1726867238.55448: checking to see if all hosts have failed and the running result is not ok
13131 1726867238.55448: done checking to see if all hosts have failed
13131 1726867238.55449: getting the remaining hosts for this loop
13131 1726867238.55450: done getting the remaining hosts for this loop
13131 1726867238.55453: getting the next task for host managed_node1
13131 1726867238.55457: done getting next task for host managed_node1
13131 1726867238.55457: ^ task is: None
13131 1726867238.55459: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13131 1726867238.55460: done queuing things up, now waiting for results queue to drain
13131 1726867238.55461: results queue empty
13131 1726867238.55462: checking for any_errors_fatal
13131 1726867238.55462: done checking for any_errors_fatal
13131 1726867238.55463: checking for max_fail_percentage
13131 1726867238.55464: done checking for max_fail_percentage
13131 1726867238.55465: checking to see if all hosts have failed and the running result is not ok
13131 1726867238.55465: done checking to see if all hosts have failed
13131 1726867238.55468: getting the next task for host managed_node1
13131 1726867238.55470: done getting next task for host managed_node1
13131 1726867238.55471: ^ task is: None
13131 1726867238.55472: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False

PLAY RECAP *********************************************************************
managed_node1              : ok=108  changed=5    unreachable=0    failed=0    skipped=121  rescued=0    ignored=0

Friday 20 September 2024  17:20:38 -0400 (0:00:00.794)       0:00:53.666 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.02s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.94s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 1.88s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
fedora.linux_system_roles.network : Check which services are running ---- 1.85s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.82s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.80s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.52s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.47s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.09s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.07s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Install dnsmasq --------------------------------------------------------- 1.05s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.03s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.99s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.91s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.90s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.88s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.84s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.80s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Verify DNS and network connectivity ------------------------------------- 0.79s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Check which packages are installed --- 0.79s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26

13131 1726867238.55814: RUNNING CLEANUP
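Editor's note: the "Verify DNS and network connectivity" task result in this log carries its script as a single escaped `cmd` string, and its STDERR is mostly curl's progress meter. For rerunning the check by hand, here is the same logic unescaped into a standalone sketch. The `check_host` wrapper name and the `-sS --fail` curl flags are additions, not part of the playbook (the task's own `curl` had no flags, which is why the progress meter fills STDERR above):

```shell
#!/usr/bin/env bash
# Standalone rendering of the task's inline command.
# Additions vs. the playbook: the check_host wrapper, and curl's
# -sS (silence the progress meter but keep errors) and --fail
# (turn HTTP >= 400 into a nonzero exit status).
set -euo pipefail

check_host() {
    local host=$1
    # getent resolves via nsswitch (files, DNS, ...), same as the task.
    if ! getent hosts "$host"; then
        echo "FAILED to lookup host $host"
        return 1
    fi
    if ! curl -sS --fail -o /dev/null "https://$host"; then
        echo "FAILED to contact host $host"
        return 1
    fi
}
```

Mirroring the task's loop: `echo CHECK DNS AND CONNECTIVITY; for h in mirrors.fedoraproject.org mirrors.centos.org; do check_host "$h"; done`.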